Today we are starting a new column at Air Facts called Washington Report. Our aim is to alert you to new or proposed changes to regulations that affect pilots. This includes Notices of Proposed Rulemaking (NPRMs), Aviation Rulemaking Committee (ARC) reports and more. No boring legislative discussions here, just the facts and a chance for you to weigh in on important issues. –Ed.

Written test ARC report

The ARC recently released its report on written testing.

In 2011, a rash of Knowledge Test (“the written” to long-time pilots) failures at numerous flight schools caused a bit of a stir, and the FAA admitted that it had added a number of new questions to the test question database without notifying test prep providers or flight instructors. Some of these questions were downright silly, with no relationship to real-world flying or even textbook answers. The written test has often been derided as out of date and out of touch (a question on celestial navigation is still in the database), but the most recent controversy exposed the flawed system that creates test questions in the first place.

In response, an Aviation Rulemaking Committee (ARC) was formed by the FAA to study the issue of Knowledge Tests in general: how the questions are developed, what the goal of these tests is and how to keep them up to date. The ARC recently released its full report (read it here), with dozens of criticisms. A summary of the group’s recommendations is below:

  • Establish a stakeholder body of subject matter experts and relevant FAA policy offices to assist with the development and boarding of knowledge questions, review standards and handbook content, and assist with the review of the current bank of FAA test questions for validity and quality.
  • The ARC recommends the FAA revise the quality management system (QMS) process through which key policy offices make recommendations to the FAA Airman Testing Standards Branch about needed changes to training and testing documents.
  • The ARC recommends the FAA transition to a single testing standards document, the Airman Certification Standards (an update to the PTS), to include knowledge, skills, and risk management.
  • Concerning the philosophy of question development, the ARC recommends the FAA write questions that are pertinent to safe operations and necessary for sound airmanship, and maintain discretion to write questions that reference multiple documents so that applicants must correlate data. Further, the FAA should adopt a continuous review process to ensure test questions are relevant to the current technology used in aviation, with priority given to removing obsolete information from the tests.
  • The ARC recommends the FAA improve the feedback mechanism subsequent to knowledge testing by providing the applicant and instructor the specific missed question(s), so that the deficient knowledge can be identified by review of the Airman Knowledge Test Report. Aggregate results of knowledge testing failure areas should be published to provide a mechanism through which training organizations, providers, and publishers can improve and better target their instruction.

What do you think about the state of written testing? Does it focus too much on obscure regulations and not enough on real world piloting skills? Should we cut down on the questions about NDBs and finally add some about GPS? And should all of these questions be published for study or kept secret? Add your opinion below.

by John Zimmerman
12 replies
  1. John Haley says:

    I thought that most of the material I studied, and the questions that were on the test, were in fact relevant. I studied long and hard for the test using multiple manuals and DVDs. I made sure that I fully understood all phases of it before scheduling the test. I wonder whether those who are having difficulty with it really studied anywhere near adequately, or whether this is another dumbing down in order to make it easier to pass.

  2. Bob Huddleston says:

    I recently took the instrument test and was surprised by what was NOT tested. I studied long and hard and did well on the exam, but my test had six different questions on how to check the VOR (VOT, dual VORs, airborne, etc.), which struck me as a bit of overkill in light of all that could have been tested. I'm not complaining, since I got those right, but it's hard to understand why 10% of my IFR knowledge evaluation was on just one limited subject area. I would think something could be done to improve the randomness of the computer question selection process.

  3. Chad Cupp says:

    I think that the test is outdated and does not prove or disprove how well you fly an airplane. When I think of the FAA test, I think of the amateur radio test (also a federal government test), where you have to learn theories of radio waves and how to construct antennas just to be licensed to talk on certain frequencies. With the FAA test you have to learn weather theories and patterns and all about VORs, when today everyone clicks on the internet to see the weather and everyone uses GPS to navigate. Sure, you need certain skills that are tested, but the majority of the test is a waste given today's technology. A $300 GPS is far better than learning old technology just to pass a test when you, and even the FAA, know you are not likely to use it.

  4. Randy Russell says:

    The test needs updating. You still need to learn most of the questions on the test, but it NEEDS to be updated to match the times and new equipment.

  5. ChrisP from NY says:

    I took the Instrument test when they were making the shift. I fortunately caught wind of the Private and Commercial fiasco and quickly went back to see what material was being added to Instrument test.

    Some of the material added was related to GPS equipment, and material on icing. Prior practice test banks didn’t include much in these areas. I beefed up on those areas and did well on the test. The issue appears to be the addition of subject areas like icing that weren’t on the prior test or not included in the prior practice test banks.

    Some of this is a response to recent safety concerns. One question was related to tailplane icing, which is obviously related to the Colgan Air Flight 3407 accident. Obviously it's important for safety or performance issues to make their way into the material. The question is how that addition is disseminated. If the FAA doesn't share its actual questions and answers, nor should it, then how can anyone know to prep for an obscure last-second addition?

    It would seem to be a communication issue from the top. The FAA wants important topics added, but whom should it tell to make sure students are covering certain areas? Bulletins to CFIs? Newsletters? FAA Safety Magazine? Notices to test prep companies?

    This is a tough question. If I’m doing my due diligence and using a standard set of materials, ASA textbooks, FAA Instrument Handbook (which had all the info on icing and GPS equipment), then I’ll always do well on the test.

    In the past, students could simply take practice tests and pass. But now there would seem to be a push to force students to study all the material like any other standard test. If the FAA’s “out” to changing the material is that it’s in the related FAA Handbook, then everyone in fairness should be instructed up front that any subject area in the Handbook is fair game for the test. Yes, that would elevate aviation knowledge and safety, but the sudden switch to update the questions should have been communicated properly.

    -ChrisP from NY

  6. Kenn Hinick says:

    I do believe that many of the test bank questions are obsolete compared to the actual training being conducted. How do you justify questions on ADFs to students who have never seen, and will never fly, an ADF-equipped aircraft? The questions should be based on learned knowledge that reflects the actual environment in which the aircraft operates. Why should anyone need to know which way to turn if the needle is on the blue side of the instrument?

  7. Tim Fountain says:

    One thing that balances the written is the oral as part of a rating. This is where the rubber really meets the road, and a good examiner will dig into areas of weakness. The written is just a bar you have to get over; IMHO the oral is a lot more daunting and, at least as the test progresses to the flying portion, much more in tune with the equipment the pilot is using for the test.
    Nothing is infallible, but the practical test is a good counterpoint to those who try to cheat the system by blindly memorizing the test questions and answers without understanding the theory behind the correct answer.

  8. Chuck Losinski says:

    Having just completed the CFI FOI and CFI knowledge tests, with the AGI written next week, all at age 50, I have some recent exposure to current tests. The FOI was very challenging due to subtle wording differences that made answers much less black and white (passed it). The CFI knowledge test devoted many questions to obscure regulations where, I do believe, even an experienced CFI would be cracking open the FAR/AIM (passed that one too). Not sure what to expect on the AGI. I'm thinking the oral and practical will be equally challenging!

  9. Mike Savage says:

    On the private, I scored 100. On the instrument, I scored a 95. How did I do it? I memorized the correct answer to each question. I did this late at night while on ocean cruises holed up in a conference room. I did them over and over until I could get an entire section correct.

    Why did I do it? Because some were trick questions where a different verb or adjective in the answer made the difference. Some questions related to old weather maps that I could no longer read. Some of the isogonic lines had since changed on the maps. Many of the approach plate questions had subtle tricks to them (how many initial approach fixes are there to this airport?). Reading and understanding the answers is where I learned the material.

    This is, for some, way too much work for a few extra points, but I definitely benefited from it.

    I had a great CFII and did fine on the oral and check ride.

  10. Tom Knights says:

    Proposed changes to FAA written exams are long overdue. “Gotcha” questions that lead to other wrong answers are not clever or helpful to understanding essential aviation concepts that relate to real-world flying. The proposed revisions would have happened years ago if not for a creaky, bloated and rule-bound FAA bureaucracy that must continue to add roadblocks and complexity to a system which, in turn, requires ever more rules and complexity, etc. Common sense test questions will raise confidence levels, encourage new pilots to learn more, and lead to a safer and more professional pilot population.

  11. Vicki B. says:

    In the Aviation Instructor's Handbook, the FAA has information on how to formulate good test questions. I am disappointed when I look through the question bank. Many of the questions are exactly the type that the handbook calls out as unclear, unfair, and invalid for the subject tested.

    Yes, the questions need to be regularly reviewed for content and also for high rates of incorrect answers. A question that large numbers of people are answering wrong may point to larger issues with the available teaching materials. Without such evaluation, you cannot hope to create a test that actually means something.


    As an aviator and instructor, I can tell you that multiple choice testing is less desirable as a testing tool than essay-type questioning or, in the case of the practical test, the oral portion of the exam. From a practical standpoint, grading of such tests is fast and efficient, allowing applicants to move quickly to the next steps of training.

    At one time most FAA knowledge exams provided four possible answers, as opposed to today's three for most airman exams. More possible answers mean more opportunities for the test giver to examine the applicant's knowledge and ability to discriminate between similar facts and theories in choosing the best answer. It may be that the FAA chose to make the written easier; however, authors of new questions for the various question banks may also be tempted to use more ambiguous language or insert newer obscure questions to fit the agency's “bell grade curve” for a particular time period.

    Technology exists to scan and grade essay-type questions. Perhaps the FAA could explore such technology, along with more comprehensive multiple choice question construction, in order to reduce memorization as the key strategy for FAA Knowledge Test taking while better examining the applicant's knowledge base for safe flight.
