To determine the psychometric integrity of the 95 questions, two human factors graduate students administered the exam, along with a demographic survey, to 204 GA pilots of varying skill/certification levels. These included 134 Aeronautical Science students from a southeastern university and an additional 70 participants attending the 2016 EAA AirVenture in Oshkosh, WI.
This presentation reports the preliminary analysis of the results from the 70 Oshkosh participants, which included 14 student pilots, 26 private pilots, 15 private pilots with instrument ratings, and 15 commercial pilots with instrument ratings. The results indicate that the questions had varying degrees of difficulty, showed high internal consistency, and discriminated between pilots of different skill/certification levels.
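For readers unfamiliar with the item-analysis statistics behind these claims (item difficulty, internal consistency, and discrimination), the sketch below shows one common way they are computed from a 0/1 scored response matrix. This is an illustrative example only, not the authors' analysis code; the function name and the simulated data are invented for the example.

```python
import numpy as np

def item_analysis(responses):
    """Basic item analysis for a 0/1 scored response matrix
    (rows = examinees, columns = items). Illustrative only."""
    responses = np.asarray(responses, dtype=float)
    n_examinees, n_items = responses.shape

    # Item difficulty: proportion of examinees answering each item correctly.
    difficulty = responses.mean(axis=0)

    # Internal consistency: Cronbach's alpha.
    item_vars = responses.var(axis=0, ddof=1)
    total_var = responses.sum(axis=1).var(ddof=1)
    alpha = (n_items / (n_items - 1)) * (1 - item_vars.sum() / total_var)

    # Item discrimination: corrected item-total correlation, i.e., each item
    # against the total score with that item excluded.
    totals = responses.sum(axis=1)
    discrimination = np.array([
        np.corrcoef(responses[:, j], totals - responses[:, j])[0, 1]
        for j in range(n_items)
    ])
    return difficulty, alpha, discrimination

# Example with simulated responses (5 examinees, 4 items):
scores = [[1, 0, 1, 1],
          [1, 1, 1, 0],
          [0, 0, 1, 0],
          [1, 1, 1, 1],
          [0, 0, 0, 0]]
difficulty, alpha, discrimination = item_analysis(scores)
print(difficulty, alpha, discrimination)
```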
Specifically, the initial results indicate that although mean exam scores increased with certification level, the only statistically significant difference was between the commercial pilots with instrument ratings and the student pilots; no statistically significant differences were found among the other certification-level groups. Initial results also indicate that all pilot groups scored significantly higher on flight planning and weather source questions and significantly lower on questions requiring interpretation of weather forecast and observation products.
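The abstract does not state which test was used for these group comparisons; one common approach for comparing mean scores across several certification levels is a one-way ANOVA followed by a Tukey HSD post-hoc test, sketched below on simulated data. The group sizes mirror the abstract, but the score values and group means are invented and do not reflect the study's results.

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical exam scores (percent correct) by certification group;
# group sizes follow the abstract (14/26/15/15), values are simulated.
rng = np.random.default_rng(0)
groups = {
    "student":          rng.normal(55, 10, 14),
    "private":          rng.normal(60, 10, 26),
    "private_instr":    rng.normal(63, 10, 15),
    "commercial_instr": rng.normal(70, 10, 15),
}

# Omnibus one-way ANOVA across the four certification levels.
f_stat, p_value = f_oneway(*groups.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# Tukey HSD post-hoc test to identify which pairs of groups differ.
scores = np.concatenate(list(groups.values()))
labels = np.concatenate([[name] * len(vals) for name, vals in groups.items()])
print(pairwise_tukeyhsd(scores, labels, alpha=0.05))
```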
In addition to detailed exam results, the presentation will discuss some of the challenges experienced while developing the questions, such as product guidance not keeping pace with the products available online, as well as suggestions for improving weather guidance for pilots.