Data Management Guide


Advanced Qualification Program
Data Management Guide

Produced by the Air Transport Association's Data Management Focus Group
Second Release 5/12/98
TWA RPO

This document supports the Advanced Qualification Program (AQP) and is intended for reference use only in the collection, reporting, analysis, and overall management of AQP and single-visit training proficiency data. The document was prepared by the Data Management Committee of the Air Transport Association's AQP Working Group and contains recommended information only. An individual carrier's approved AQP documentation and/or single-visit exemption takes precedence over the content of this guide. SFAR 58 and AC contain the approved procedures for developing and administering AQP training curricula.

Acknowledgements

In 1995, the Data Management Focus Group recognized the need to develop a Guide that would enable us to share our knowledge of the many facets of data management within an AQP environment. Teams from the participating airlines considered each of these facets and prepared guidance, which was given to the Group Chairman, O. J. Treadway of American Airlines, who produced a Guide that was distributed on May 29, 1996 at the annual meeting of the ATA AQP group in Minneapolis.

In 1997, the Group met to consider changes to the Advisory Circular. Chapter 8, which deals with data management, was prepared by Paul Johnson of the FAA National AQP office, AFS-230. It contained a wealth of detail and updated information, using the Guide as one resource. The Group felt that this chapter would serve as the basis for a new version of the Guide, and extracted its essence so that it would parallel the input of the other groups in the Advisory Circular. In January 1998, the Group decided to attempt to produce the Guide for distribution at the 1998 annual meeting. With a rather short deadline, it took the efforts of many to prepare this Guide. Paul Johnson was, of course, of exceptional help, without which it could not have been completed. The review team included Dennis Conley, also at AFS-230, Mandy Blackmon (Alaska), Matt Humlie (Delta), Iris MacIntosh and Joy Lanzano (United), and Kevin Sliwinski (Northwest). Their hours on conference calls and very careful study of the draft have yielded a document that I believe reflects the current state of data management within AQP.

I must also recognize the help of Professor Bob Holt of George Mason University, who has shared his passion for data and statistics with the Group through very entertaining and informative presentations at our meetings. He has also prepared inclusions for this Guide, which should serve as a starting point for those who have questions about how to handle their data.

Finally, I must thank Capt. Jack Eastman, Director of Training at TWA, for his support and for expediting the last-minute printing of this Guide.

Robert P. Odenweller (TWA)
Chairman, Data Management Focus Group
12 May 98

Table of Contents

SECTION 1. INTRODUCTION
    Purpose...1
SECTION 2. DATA MANAGEMENT
    Definition
    Individual Qualification Records
    Performance/Proficiency Data
    Purpose
    Costs
    Related Data Management Projects...4
        a. Model AQP Database...4
        b. Inter-Rater Reliability (IRR)...5
        c. Flight Operations Quality Assurance (FOQA)...5
        d. National Transportation Safety Board (NTSB)...5
SECTION 3. GENERAL DATA CONSIDERATIONS
    General
    Characteristics of Data Collection
    Population for Measurement (Who)...6
        a. General Population...6
        b. Sampling Units
    Items for Measurement (What)...7
        a. Item Definition...7
        b. Item Selection
    Methods of Measurement (How)...8
        a. Degree of Objectivity & Subjectivity in Measurements...8
        b. Samples...9
        c. Sampling Methods...9
        d. Aggregating or Dividing Samples
SECTION 4. COLLECTING QUALITY DATA
    General
    Data Collection vs. Evaluation
    High Quality Data Attributes...12
        a. Sensitivity
        b. Reliability
        c. Validity
        d. Good Data Collection: Standardization and Completeness
    Use of High- or Low-Quality Data
    Non-Sampling Errors

SECTION 5. SVT VS. AQP DATA COLLECTION
    General
    SVT/AQP Differences
    SVT/AQP Similarities...18
SECTION 6. DETAILS OF AQP DATA COLLECTION
    General
    Data to be Collected
    Performance Data (TPO/SPO/Event Set)
    Enabling Data (EO)
    Demographic Information
    Crewmember Identification...21
SECTION 7. MEASUREMENT CODES...22
SECTION 8. INSTRUCTOR/EVALUATOR CONSIDERATIONS
    General
    Training Rotations
    Instructor Limits on Data Collection...24
SECTION 9. ACTIVITIES REQUIRING DATA COLLECTION
    General
    Phase III (Implementation)...25
        a. Existing Instructor/Evaluator Training...25
        b. New Instructor/Evaluator Training...25
        c. Small Group Tryouts
    Phases IV & V (Initial & Continuing Operations)...26
        a. Training...26
        b. Validation...26
        c. Line Operational Evaluation (LOE)...26
        d. Line Evaluations...27
        e. First Look (Continuing Qualification Only)...27
        f. Instructor/Evaluator Data Collection Summary...28

SECTION 10. DATA ENTRY
    General
    Performance/Proficiency Database
    Database Hardware/Software
    Data Entry Methods...30
        a. Fully Automated
        b. Manual
        c. Semi-Automated
    Data Entry Systems...32
        a. Instructor/Operator Station (IOS)
        b. Onsite Computer Entry
        c. Optical Mark Reader (OMR)
        d. Flatbed Scanner
SECTION 11. DATA REPORTING
    General
    Internal Reporting
    Report Formats...33
        a. Written
        b. Tables
        c. Graphs & Charts
    Reporting Software...34
SECTION 12. PROPOSED AIRLINE DATA REPORTING
    Report Timing
    Report Content
    Detailed Reports...34
        a. Red Flags for Detailed Reports
    Summary Reports...35
        a. Red Flags for Summary Reports
    Special Reports...36
        a. Drill-Down Analyses
    Report Formats
    Report Process...38
SECTION 13. FAA REPORTING
    SVTP Data Collection
    AQP Qualification Curriculum
    AQP Continuing Qualification Curriculum
    Data Transmittal to FAA
    Required Documentation for Format Approval
    Transmittal Instructions...45

SECTION 14. MEANINGFUL DATA ANALYSIS
    General
    A Logical Sequence for Meaningful Data Analysis
    Eight Steps to Meaningful AQP Data Analysis
    Sample Size
    Summary...52
APPENDICES
    A. Terms and Definitions for Items Found in the...A-1
    B. Data Entry Summary...B-1
    C. Carrier Summary...C
    D. Sample Data Collection Forms...D
    E. Sample Database Structures...E

AQP DATA MANAGEMENT CONSIDERATIONS

SECTION 1. INTRODUCTION

1. Purpose

This guide updates and replaces information first presented in the initial release (May 29, 1996), developed by the Data Management Focus Group of the AQP Subcommittee, which is sponsored by the Air Transport Association. It provides data management guidance by discussing general data considerations and specific data collection, entry, reporting, and analysis requirements. Law and regulation in Part 121, Part 135, and SFAR 58 define these requirements. It also expands on the information provided in Advisory Circular A and incorporates data submission guidance set forth in the memorandums "Streamlining Initiatives" (June 20, 1995) and "SVE/AQP Performance/Proficiency Data Submission Requirements Change" (September 19, 1996) issued by the FAA's Manager of AQP. This guide also includes findings from recent AQP grant research conducted between academia and selected AQP-participating airlines.

The principal goal of the AQP is true proficiency-based training and qualification. This proficiency base (expressed as performance objectives) is systematically developed and maintained, then continuously validated through the collection and evaluation of empirical performance data. Data collection and analysis (data management) is, therefore, an integral part of AQP and of the SVT (Single Visit Training) exemption, which is usually implemented by carriers as a precursor to AQP.

SECTION 2. DATA MANAGEMENT

1. Definition

An AQP curriculum must include procedures for data collection that will ensure that the certificate holder provides information from its crewmembers, instructors, and evaluators that will enable the FAA to determine whether the training and evaluations are working to accomplish the overall objectives of the curriculum.
In addition, it is equally important for the carrier's own personnel (fleet managers, instructors, evaluators, and curriculum development staff) to employ good data management to evaluate the effectiveness of their AQP in meeting its objectives. While airlines are expected to perform more in-depth data management, the FAA's Manager of AQP looks at higher level, global issues across all air carriers. The FAA acknowledges that data between airlines differ in many respects, and that it is neither desirable nor possible to make any meaningful cross-comparison between carriers.

Data management is required for all participants within an AQP, including crewmembers, instructors, and evaluators. It is classified into the two broad categories of individual qualification records and performance/proficiency data.

2. Individual Qualification Records

An important component of any FAA-approved training program is an adequate system of maintaining individual qualification records. Individual qualification records are identifiable records, in sufficient detail, on each individual who is qualifying or who has qualified under an AQP, showing how and when the individual satisfied the requirements of each curriculum. These records may include demographic and work history information, as well as completion information on modules and lessons within each indoctrination, qualification, and/or continuing qualification curriculum for each crewmember, instructor, and evaluator.

Carriers may maintain a manual record keeping system based on the standard Part 121 or 135 record keeping requirements or may design a computerized record keeping system. Automated methods of collecting information for qualification record keeping are being explored at some carriers. It is important to note that individual qualification record keeping systems are not unique to AQP. It is presumed that all carriers approved for operations have current and acceptable record keeping systems for individual qualifications. Any existing records and record keeping systems approved for use under traditional programs that comply with the AQP requirements, and are otherwise acceptable to the FAA as meeting Part 121 and 135 requirements, may be used and do not need to be duplicated for AQP.

3. Performance/Proficiency Data

Performance/proficiency data provide deidentified information on individual performance used in the aggregate to analyze training programs and/or groups of participants, to spot developing trends, and to identify and correct problems that may be noted. Performance data are used to determine long-range trends and to support training program validation and improvement initiatives, not for tracking individual accomplishment. Usually, performance/proficiency data are unrelated to and separate from the individual qualification record keeping system. The process of managing performance/proficiency data in an AQP is the primary focus of this Guide, and the term "data management" as used throughout the remainder of this Guide refers to that process. The data management process consists of the four activities below, each of which is described in detail in subsequent sections of this guide:

- Collection of proficiency data
- Entry of the collected data into a performance/proficiency database
- Conducting statistically sound analyses on the aggregate performance data
- Reporting of the data to the appropriate managers and fleet personnel

4. Purpose

The principal reason for good data management at any carrier is to establish a systematic process for quality control over pilot training and qualification. This is attained by using data management to accomplish the four general activities. General applications include:

- Using individual qualification records to validate the proficiency of crewmembers, instructors, and evaluators
- Using aggregate performance/proficiency data to:
    - Validate curriculum content and/or indicate needed program changes
    - Establish expectations and performance norms for crewmembers
    - Provide a benchmark for requested and/or needed changes over time

Specific applications of the information from the data management process can be used in a variety of ways, such as:

- Provide assurances of proficiency levels
- Establish expectations and determine variations from those expectations
- Assess instructional quality
- Validate training assumptions

- Analyze effectiveness of instructors and evaluators
- Provide instructor and evaluator feedback
- Refine the training and/or measurement process
- Indicate where training changes are needed
- Validate alternative training technologies
- Provide common grounds for sharing of information between carriers
- Provide a quantitative means for CRM assessment

The carrier also may use this information to support a request for a modification of an approved AQP. For example, the carrier may request FAA approval for a three-month extension of the evaluation period in a continuing qualification curriculum (15 months instead of the normal 12 months). The carrier must be able to support its request with statistically valid data that indicate that present crewmember performance warrants the extension, and to confirm its ability to continue to collect valid data that show that performance does not degrade as a result of the extension.
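As a concrete sketch of the aggregate use of deidentified data described above, the fragment below tallies per-maneuver first-time pass rates. The record layout, maneuver codes, grade scale, and pass threshold are all hypothetical, invented for illustration; they are not prescribed by this Guide or by any carrier's AQP.

```python
from collections import defaultdict

# Hypothetical deidentified performance/proficiency records: each row carries
# fleet, seat, maneuver, and grade, but no crewmember identity.
records = [
    {"fleet": "B737", "seat": "CA", "maneuver": "NPA", "grade": 4},
    {"fleet": "B737", "seat": "FO", "maneuver": "NPA", "grade": 2},
    {"fleet": "B737", "seat": "CA", "maneuver": "V1_CUT", "grade": 3},
    {"fleet": "B737", "seat": "FO", "maneuver": "V1_CUT", "grade": 4},
    {"fleet": "B737", "seat": "CA", "maneuver": "NPA", "grade": 3},
]

PASS_THRESHOLD = 3  # assumed: grades at or above this count as a first-time pass


def pass_rates_by_maneuver(rows):
    """Aggregate deidentified grades into per-maneuver first-time pass rates."""
    totals = defaultdict(lambda: [0, 0])  # maneuver -> [passes, attempts]
    for row in rows:
        bucket = totals[row["maneuver"]]
        bucket[1] += 1
        if row["grade"] >= PASS_THRESHOLD:
            bucket[0] += 1
    return {m: passes / attempts for m, (passes, attempts) in totals.items()}


rates = pass_rates_by_maneuver(records)
```

A fleet manager could compare such rates month over month to establish performance norms or flag maneuvers that may need curriculum attention.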

5. Costs

Management of performance/proficiency data requires a commitment by the carrier of specific resources for collecting, entering, reporting, and analyzing the data. The level of resources necessary varies between carriers, but depends primarily on three factors:

- Extent to which the data are used (e.g., will the data be used for process control, trend analysis, fleet comparisons, training effectiveness, cost control, compliance with regulatory requirements, requesting interval extensions, or some combination?)
- Quality of the data required for each purpose and objective (data quality issues are discussed later in the guide)
- Size of the carrier (amount of data to be collected)

The costs associated with data management are directly related to the number of objectives, the number of comparisons made, the amount of data collected, and the required quality of data. In all likelihood, no carrier will be willing to commit sufficient resources to collect high quality data for all purposes and objectives. However, most carriers may want to collect high quality data for a limited set of specific objectives that are critical in the carrier's Data Analysis Plan. Therefore, before beginning any data collection, each carrier should determine its own objectives and data analysis plan, and then commit to the level of resources required. If necessary, a cost-benefit analysis may be advisable before embarking on any serious statistical analyses.

6. Related Data Management Projects

The FAA and the aviation industry are conducting a number of related, complementary projects and data collection programs. Of particular interest to the AQP process may be the emerging data management techniques of programs that focus on next-generation data, specifically the collection, archiving, and analytical processes of these programs.
Economies of scale and simplicity of operations management may be possible by considering the impact and relationship of these programs to the overall objectives of the airline.

a. Model AQP Database

The FAA is developing a Model AQP Database for distribution and use by participating airlines as a guide in preparing their AQP at a minimal cost. The program requires automation of the Program Audit Database (PADB) and the Performance/Proficiency Database (PPDB), which can pose a prohibitive cost to smaller carriers. An FAA-developed model that can be used or imitated by any carrier will allow the FAA to be responsive in its mission to improve the safety of all air carriers without regard to financial status or size. The first phase of development involved extensive research into the methodology that should be used to develop the curriculum for an AQP-type training program. Originally developed in Paradox, and based upon user input, the existing database was migrated to a Microsoft Access platform. The redesign of the Model AQP Database includes: migrating the previous PADB and incorporating user-suggested enhancements, prototyping the PPDB and linking it to the PADB, and developing a data analysis tool for the FAA's AQP office. Carriers will be able to modify the application to meet individual needs.

b. Inter-Rater Reliability (IRR)

This study describes an innovative and comprehensive training program for improving inter-rater reliability, or rater calibration, for aviation instructor/evaluators using videotaped flight simulations. The training includes the

development of individual pilot and group profiles for feedback and discussion. It also offers baselines and suggests potential benchmarks as standards for rater calibration. The assessment of pilot performance relies on systematic observation and evaluation by trained raters or instructor/evaluators. Therefore, the reliability of the rater or I/E is critical to safety and flight standards. The program addresses the measurement of the reliability of I/E judgments and training that would improve inter-rater reliability.

c. Flight Operations Quality Assurance (FOQA)

FOQA embodies data collection, management, and operational insight into flight operations and associated quality control processes. As such, some of the development and implementation strategies in the FOQA program may prove helpful to AQP managers. Although the FOQA and AQP databases are completely distinct, AQP can provide tools that can support data collection and analysis as a portion of the carrier's FOQA program.

d. National Transportation Safety Board (NTSB)

Human factors receive a great deal of emphasis in an NTSB investigation into a major accident. Over the course of time, the NTSB has developed some very reliable methods for analyzing human performance data. Studying and applying some of the data management methods of the NTSB may be beneficial to some operators. This includes development of report forms, interviews, and collection processes, as well as analytical techniques that are designed to elicit detailed information on the performance of the people involved in the piloting of aircraft.
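Inter-rater reliability of the kind discussed above is commonly quantified with a chance-corrected agreement statistic. The sketch below computes Cohen's kappa for two evaluators who graded the same set of performances; it is a generic illustration only, not a method prescribed by the IRR study or this Guide, and the grade values are invented.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters grading the same performances.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and
    p_e is the agreement expected by chance from each rater's grade marginals.
    Inputs are parallel lists of grades (one entry per performance observed).
    """
    n = len(rater_a)
    assert n == len(rater_b) and n > 0
    categories = set(rater_a) | set(rater_b)
    # Observed agreement: fraction of performances graded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal rate, per category.
    p_e = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
              for c in categories)
    return (p_o - p_e) / (1 - p_e)


# Hypothetical 4-point grades assigned by two I/Es to eight taped sessions.
kappa = cohens_kappa([4, 3, 2, 4, 3, 3, 4, 2],
                     [4, 3, 3, 4, 2, 3, 4, 2])
```

Values near 1 indicate well-calibrated raters; values near 0 indicate agreement no better than chance, a signal that rater recalibration training may be needed.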

SECTION 3. GENERAL DATA CONSIDERATIONS

1. General

The process of managing data within an AQP or SVTP involves collecting, entering, reporting, and analyzing performance/proficiency information. However, before the data management process can be described in detail, it is important to discuss some general considerations regarding managing the data. This section of the Guide describes the general characteristics of data collection, the quality control aspects of data collection, some considerations for the instructor and evaluator with regard to collecting data, and the differences between management of data for an AQP versus an SVT program.

2. Characteristics of Data Collection

Certain characteristics are common to all types of data collection activities and center on three main areas: defining the population to be measured (who), defining what about the population is to be measured (what), and determining the methods of performing the measurements (how). These characteristics of data collection and how they relate to the AQP/SVT data management process are described below.

3. Population for Measurement (Who)

a. General Population

The population to be measured must be defined in order to use good statistical analysis methods. Even if the population is generally known, it is still necessary to have a precise definition, because all measurements relate only to the defined population from which they were taken. For AQP, the general population for measurement is made up of crewmembers, instructors, and evaluators.

b. Sampling Units

The specific elements selected from the general population for sampling are called sampling units. Sampling units must be unique, easily identifiable, and selectable. A crewmember occupying a cockpit seat position is both discrete and clearly identifiable. Measurements can easily be associated with the Captain, First Officer, or Flight Engineer.
Consequently, in AQP, the usual sampling units are the crewmembers scheduled for qualification and continuing qualification training and line checks within a specified reporting period, usually one month. In addition, individual scores for each crewmember may be combined for crew-level analyses, in which case the sampling unit is the crew.

4. Items for Measurement (What)

After defining the population and the units for sampling, the specific items to be measured must be defined and selected. For SVT data, the measurement items primarily are the Appendix F maneuvers with conditions added. AQP data use a tailored set of Terminal and Supporting Proficiency Objectives (TPOs/SPOs) and event sets. A more detailed description of the specific items for measurement is provided later in the Data Collection section of this Guide.

a. Item Definition

Measured items must have clear definitions, be easily recognizable, have a clearly defined set of observable measurement parameters (qualification standards), and be given a discrete name. During training and validation, measured items normally are the individual TPOs and SPOs. However, during LOFT and LOE, these same measured items may tend naturally to flow together, as when certain items occur in natural combinations or when the boundary between items becomes less distinct. When measured items are presented as a sequence of events (as in LOFT and LOE), each item must have a defined start and stop point. In addition, the instructors and evaluators must have sufficient experience so that they can recognize each measured item as it is accomplished, compare the observed performance against performance standards, and assign the correct grade.

For example, consider a fairly typical event set that might consist of a CAT II approach to a landing with one engine out. This event set generally will be graded as a unit, which is acceptable as long as any conclusions that are derived are based on the entire event set. However, this event set also contains three discrete events: a CAT II approach, a landing, and one-engine-out procedures. If conclusions about the CAT II approach are desired, then items specific to the CAT II approach should be graded separately, or a separate CAT II approach event should be created. It is important to note that the definition of events for data collection may not necessarily be the same as for training purposes. For example, an LOE event set used for evaluation can be defined as a set of discretely measured items, but may also have distinct items reflecting specific training goals.
Data collected on the event set should be analyzed at the level of the event set for evaluating performance, and also at the level of the component training items included in the event set for training program feedback.

b. Item Selection

For any data collection, instructors and evaluators may have the flexibility to select from various aspects of the measured items. While this may be good for training, it is not good for data collection, because if different evaluators select different items, the variable mix of items each pilot is tested on will affect the results and make them less reliable. Therefore, for performance evaluation, the selection of items to be measured and the measurement conditions must be defined and fixed before the data collection process starts. Good definitions and consistent administration contribute to high-quality data. For Maneuver Validation, the First Look and Fixed maneuvers are precisely defined evaluation items, while the variation in administering the Variable maneuvers aids training but at the same time makes the ratings less useful for evaluation. Particularly for First Look and Fixed maneuvers, the allowance of variations must be strictly controlled. For example, when evaluators are given a choice of NDB, VOR, and Back-Course Localizer approaches for administering a non-precision approach (NPA), the Back-Course Localizer approaches are almost never performed, and the relative mix of NDB and VOR approaches given can differ among evaluators, leading to different evaluation outcomes. This must be prevented by evaluator training or direct assignment of NPA items to ensure representative evaluation outcomes.

For an LOE, the evaluator must know the script thoroughly, because this makes grading easier and more reliable. Further, the evaluator must be trained not to depart from the script except for planned script variations. This ensures that each crew receives, as closely as possible, the same evaluation.
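One possible form of the "direct assignment of NPA items" mentioned above: a central scheduler could rotate the approach type across scheduled crews rather than leaving the choice to each evaluator. This is a sketch under assumed names (the type codes and function are invented for illustration, not taken from the Guide).

```python
import itertools

# Assumed approach-type codes for the NPA example in the text.
NPA_TYPES = ["NDB", "VOR", "LOC_BC"]


def assign_npa_types(crew_ids):
    """Deterministically rotate approach types across scheduled crews so the
    monthly mix stays representative regardless of evaluator preference."""
    rotation = itertools.cycle(NPA_TYPES)
    return {crew: next(rotation) for crew in crew_ids}


assignments = assign_npa_types(["C1", "C2", "C3", "C4"])
```

Because the assignment is fixed before the evaluation period begins, no evaluator's preference can skew the relative mix of NDB, VOR, and Back-Course Localizer approaches in the monthly sample.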
The LOE script must be selected to be operationally relevant and developed to be a psychometrically sound evaluation (discussed later in this Guide). Carriers with small fleets may use a single script, with all events and conditions the same for all crewmembers. The advantage is that the process is the same for all pilots and will produce the largest possible sample size for analysis. The disadvantage is that with repeated testing, crews may become familiar with the script. Carriers with sufficiently large fleets that can support multiple samples may use multiple scripts. The script change may be as simple as the pilot flying role, good-day versus bad-day conditions, different permutations of events, etc. Multiple scripts can provide a more complete view of crew parameters and make it more difficult for crews to become familiar with any one script. However, each permutation of scripts divides up the overall sample size into smaller

aggregates. Even the largest of fleets may not be able to use more than four script permutations. The processes used for forming reconfigurable LOE script variations are currently under study by the AQP office in conjunction with researchers and a major carrier. Carriers should develop an LOE tracking system for each crewmember to prevent the same LOE from being used in consecutive evaluations, and to track the LOE variations used for each crew if such variations are implemented.

5. Methods of Measurement (How)

a. Degree of Objectivity & Subjectivity in Measurements

Measurements generally are classed as more objective or more subjective, depending on the amount of human judgment involved. More objective measurements require little or no human judgment and are defined in some kind of physical unit (e.g., altitude in feet, speed in knots, heading in degrees). An example of more objective data is detailed FOQA data. Detailed objective measurements must usually be combined or transformed to arrive at a useful performance evaluation (e.g., the degree of stability on an approach). The original detailed data must be sensitive to performance differences, the combinations or transformations used on the data must be appropriate and applied consistently, and the final result must be shown to be a valid indicator of pilot performance. The AQP office is currently supporting this kind of research with FOQA data. For some aspects of pilot performance, objective data may ultimately give sensitive, reliable, and valid results. However, for other aspects of performance, such as CRM skills, more subjective measurements will probably be necessary. More subjective measurements require that a person evaluate the measured item and assign a score or rating. With subjective measurements, each evaluator combines basic observations into a performance evaluation.
Evaluators need to be trained to make a sensitive distinction between different performance levels on each item, to combine information consistently into a final assessment, and to assign an appropriate grade. The result of a subjective evaluation process must also be assessed for measurement validity. Recurrent training in evaluation and data collection procedures for evaluators is just as important as recurrent training in flying procedures for the crew. It is similar to the regular checks performed on the calibration of devices that record more objective data, such as altimeters or airspeed indicators. In most instances, AQP and SVT data are measured subjectively. The data consist of repeat counts and event ratings that depend on the subjective judgment of the instructor/evaluator. Therefore, it is essential to good data collection that outside biases do not influence the judgment of the evaluator.

b. Samples

A sample is a subset of the population and/or performance events. Conducting evaluations on a sample allows the parameters of the general population to be estimated from the sample results. Because population parameters must be estimated, it is not possible to determine the exact parameters of the entire population. However, the analyst can determine how close these estimates are with a known level of certainty. The certainty of these estimates depends on the sample size and the amount of variation in the item or index. The AQP continuing qualification and SVT recurrent training programs require that all crewmembers be evaluated each period and/or cycle. Consequently, AQP and SVT data have some characteristics of a census, in that some data will be collected on almost all sampling units in a fleet (crewmembers, instructors, and evaluators). However, they differ from a census in that data are not taken from all sampling units at once, but rather in monthly samples spread out over the year.
In the AQP continuing qualification curriculum and SVT, a fraction of the crewmember population is evaluated each month on a subset of events, such that all crewmembers are evaluated within the period (usually 12 months). Each monthly subset can be considered a sample from the parent population. Since the monthly sample size is fixed by fleet size and by the number of fleet pilots requiring evaluation for currency, the estimates from a single month cannot be improved by increasing the monthly sample size.
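The statement that estimates carry a known level of certainty can be illustrated with a standard normal-approximation margin of error for a proportion. This is a generic statistics sketch, not a formula prescribed by the Guide, and the monthly figures used are invented.

```python
import math


def proportion_estimate(successes, n, z=1.96):
    """Point estimate and approximate 95% margin of error for a proportion
    observed in one monthly sample (normal approximation; z=1.96 for 95%)."""
    p = successes / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return p, margin


# Hypothetical month: 52 of 60 evaluated crewmembers graded satisfactory
# on a First Look maneuver.
p, margin = proportion_estimate(52, 60)
```

With 60 crewmembers, the estimate is roughly 87% plus or minus 9 percentage points; since the monthly sample size is fixed by the fleet, narrowing that interval requires aggregating months, not sampling more pilots.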

c. Sampling Methods

The two common sampling methods are random and representative. In a random sample, all sampling units in the population are equally likely to be selected, and there are NO systematic criteria associated with the selection method. This makes the sample a true reflection of the parent population, only smaller. In a representative sample, the sampling units are selected by some systematic criterion other than equal likelihood. The pilot selection method for AQP and SVT is the set of crewmembers who need to receive continuing qualification or recurrent training in a given month. Such a sample is considered representative of the parent population, but it is not truly random. For example, because of intense waves of hiring pilots for a fleet, a cluster of pilots with certain characteristics may predominate in a given month. This bias in the monthly sample could conceivably influence the results. However, if there is no strong, systematic shift in pilot samples across sampling periods, the samples may be representative enough for statistical use.

A second bias to be avoided in the selection method is any systematic assignment of evaluators to pilots and crews. Evaluators must NOT be allowed to choose the crews they evaluate. Further, the scheduling of evaluators and crews for an assessment should be examined to ensure that it depends only on availability and NOT on any other systematic variable. For this purpose, computerized assignment of evaluators to crews is clearly preferable to clerical assignment.

A third bias to avoid is any systematic shift in the measured items over months: measured items must be the same for all monthly samples. Any systematic shift in which items are measured in certain months (e.g., a shift in the relative proportion of NDB vs. VOR Non-Precision Approaches) can bias the monthly results.

d. Aggregating or Dividing Samples

A characteristic of random samples is that any division or aggregation of a random sample is also a random sample. We make use of this property when we divide the sample by aircraft, crewmember position, maneuver, etc. to obtain more detailed statistical information. The sample can be subdivided safely into various subclasses as long as the resulting subsamples satisfy the minimum sample size restriction for statistical analyses. It is also possible to combine two or more monthly samples to create a sample large enough to measure interactions among critical determinants of performance. For example, the interaction of aircraft by seat (Captain, First Officer) by maneuver on maneuver validation may require aggregation over several months to ensure sufficient data. Aggregating monthly samples into yearly results gives a sample that is very close to a complete census for the fleet and is even more representative of the fleet population than any monthly sample. Therefore, once-a-year analyses may be performed at any natural juncture of the data collection process, such as the switch to a new LOE for the fleet.
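The subdivide-and-check discipline described above can be sketched in a few lines. The field names, grade values, and the minimum subsample size of 30 are illustrative assumptions, not requirements of this Guide:

```python
from collections import defaultdict

MIN_N = 30  # illustrative minimum subsample size for analysis

def subdivide(records, key):
    """Split a sample into subsamples by a field such as aircraft, seat, or maneuver."""
    groups = defaultdict(list)
    for record in records:
        groups[record[key]].append(record)
    return groups

# Hypothetical monthly maneuver-validation records: 35 Captains, 35 First Officers.
records = [{"seat": "CA" if i % 2 else "FO", "grade": 3 + i % 2} for i in range(70)]

for seat, group in sorted(subdivide(records, "seat").items()):
    if len(group) < MIN_N:
        print(f"{seat}: n={len(group)} is below the minimum; aggregate further months")
        continue
    mean_grade = sum(r["grade"] for r in group) / len(group)
    print(f"{seat}: n={len(group)}, mean grade {mean_grade:.2f}")
```

In practice, a subgroup that falls below the minimum would be held and combined with the same subgroup from adjacent months before analysis.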

SECTION 4. COLLECTING QUALITY DATA

1. General

Collecting data that accurately represent the population requires capturing high-quality data with a low level of errors. Sample sizes are discussed in the Data Analysis section of this Guide.

2. Data Collection vs. Evaluation

Several conflicts exist between the requirements for AQP/SVT evaluation data collection and crewmember training. These conflicts center on the rigidly controlled environment required for collecting good-quality evaluation data and the more loosely structured environment required for good-quality crewmember training. Some events, like LOFT, clearly emphasize training at the expense of evaluation. Other events, like the LOE, emphasize evaluation during the critical simulator portion while training is relegated to prebriefs, debriefs, SPOT checks, and so forth. Carriers should always keep in mind that, while good-quality data is important, the primary purpose of AQP and SVT is better training. Never let the data collection process overwhelm the training objective. If push comes to shove, the training goal should dominate, and modifications to the evaluation procedure should be noted in the data collection process (e.g., noted on the form). In this way, training is not jeopardized, while at the same time the evaluation data can be cleaned of non-standard evaluation sessions so that the descriptive and inferential statistics are accurate. A good training program should permit some instructor discretion. It may also allow the training program to be tailored individually to the crew being trained. The training scenarios and syllabi are commonly used as guides for what must be accomplished during the training period, rather than as rigid scripts. However, this kind of discretion introduces errors into the data that may, at times, make them less reliable. This may occur for training evaluations such as LOFT.
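The practice of noting modified sessions on the form so that the data can later be cleaned is easy to mechanize. A minimal sketch, with a hypothetical record layout, in which sessions flagged as non-standard are excluded before statistics are computed:

```python
# Hypothetical session records; "standard" records whether the scripted
# evaluation procedure was followed without modification.
sessions = [
    {"grade": 4, "standard": True},
    {"grade": 2, "standard": False},  # instructor modified the scenario; noted on the form
    {"grade": 3, "standard": True},
    {"grade": 4, "standard": True},
]

# Clean the evaluation data of non-standard sessions before computing statistics.
clean = [s["grade"] for s in sessions if s["standard"]]
mean_grade = sum(clean) / len(clean)
print(f"{len(clean)} standard sessions of {len(sessions)}, mean grade {mean_grade:.2f}")
```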
However, it is also the case that good training depends on accurate evaluation of a particular pilot's strengths and weaknesses in performance. Therefore, accurate evaluation is also an important part of training sessions such as LOFT. If evaluation quality is basically preserved, data can be captured from the more relaxed training environment and still provide useful information. High-quality evaluation data can be collected for a specific subset of all data, such as the First Look section of Maneuver Validation and the simulator portion of the LOE. For these subsets, the evaluation goal should have the same priority as training. This limited set of high-quality data should be collected with sufficient attention to detail that performance will be evaluated reliably. In fact, the FAA's initial analysis of SVT data from several carriers suggests that data from a subset of more carefully controlled events will produce better statistical results than data collected during a more general training session. The First Look data usually cover only a limited set of specifically defined maneuvers and conditions that are administered to both the Pilot in Command (PIC) and the Second in Command (SIC). A statistical review of the data has shown a very marked distinction in the quality of these maneuvers as compared with data gathered during the more general training sessions.

3. High Quality Data Attributes

Four primary attributes determine the suitability of data for analysis that will yield meaningful results. The greater the attention to these attributes, the higher the quality and the more useful the collected data become. The main determinant of data quality is the preservation of these attributes.

a. Sensitivity

Sensitivity means that small gradations or variations in the parameter being measured, such as maintaining heading in a maneuver, are reflected in some variation in the measurement, such as the scale rating of the pilot's performance. For a single item, a multiple-point scale will allow more sensitivity in measurement than pass/fail grading. Alternatively, using multiple items to measure individual components of performance and then combining those scores into a composite score or index could increase sensitivity.

b. Reliability

Reliability means a lack of random error in the final measurement, which is usually indicated by consistency among items or stability in the measurements over time. Reliability is affected by both sampling and non-sampling errors. Low inter-rater reliability is an example of a problem that could cause performance estimates to be unreliable.

c. Validity

Validity means that the data accurately measure what they are intended to measure. Data validity depends partly on having adequate sensitivity and reliability of the measures, but even with adequate sensitivity and reliability, the data may be measuring something other than what is intended. For example, a poorly worded grading scale may encourage grading on the evaluator's liking of a pilot's style rather than the pilot's objective performance. Data validity is weakened by vague data definitions, insufficient evaluator training, casual data collection methods, operator discretion, etc.

d. Good Data Collection: Standardization and Completeness

Standardization means the data collection procedures are uniformly followed. Consistency ensures that data differences do not come from procedural differences. The standardization attribute is very sensitive to non-sampling errors associated with instructor/evaluator training and the stability of established procedures. Completeness means that all the expected data elements, records, observations, etc. are present.
Completeness is affected primarily by non-sampling errors. Common causes of incomplete data are: failing to record a maneuver, recording only part of the required information, losing session records before entering data, data collection documents becoming separated, inability to read/scan the data collection form, corrupted files, etc.

Data collection must satisfy the following criteria for the measurement to be of high quality:

- The events to be measured must be pre-selected by some process. This could be a fixed set, random, representative, a combination, or another statistically defensible method. Selection of events to be validated/evaluated cannot be at instructor discretion.
- The measured events must be well defined, with discrete performance, conditions, and standards.
- The events to be measured must be separate and distinct. Coupled events (e.g., Non-Precision Approach to a Missed Approach) must be rated and analyzed as one contiguous event, or the individual components must be rated separately if they are to be analyzed separately.
- The instructors/evaluators must be required to make an explicit grading of all ratings, repeats, reason codes, etc. Rating by exception implies an answer and has the potential for omissions.
- Instructors and evaluators who accomplish the data collection must be trained to explicit, well-defined standards.
- Data on the measured events must be collected in a session that has been set aside discretely for validation purposes. Training should not be accomplished until the validation data collection is complete (e.g., First Look should not be intermixed with training). Note that the validation session may be the initial part of a regularly scheduled session, rather than one that stands alone.
- Devices (flight simulators) used for event measuring sessions must be of similar fidelity. For data purposes, similar devices are considered to be levels 6/7, A/B, or C/D. For example, if data from

some First Look sessions are captured in a level C simulator, then all First Look data should be taken in a level C or D simulator and not mixed with data taken from a lower-level device.

- The rating scale must be well defined, observable, and consistent (i.e., two instructors observing the same performance should be able to mark the same rating with the same reason codes).
- The volume of data to be collected should be limited to avoid instructor overload.
- The procedures for handling and entering the collected data should be well defined to avoid loss of information. No deviations from prescribed procedures should be permitted.

As can be seen from the above list of requirements, data collection for inferential statistics is serious business, and attention must be paid to the details.

4. Use of High- or Low-Quality Data

Collected data can be used to estimate aspects of the population such as an average (descriptive uses) or to answer specific statistical questions (inferential uses). Data that have been collected with emphasis on preserving the above four data attributes can be used to estimate population parameters, such as average performance, at relatively higher levels of confidence. Similarly, there is more confidence in answering statistical questions with these high-quality data. Data that have not been collected with regard to preserving these attributes give fuzzier and possibly misleading results when used for descriptive or inferential purposes. Clearly, higher-quality data are preferable to lower-quality data for all purposes. The primary difference between high- and low-quality data from a management perspective is the cost and complexity of the data collection. High-quality data are more rigorous, costly, and time consuming to collect. Consequently, high-quality data collection may occur only where necessary, on a limited subset of events and/or validation sessions.
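The rating-scale consistency requirement above (two instructors marking the same performance with the same rating) can be checked with an inter-rater agreement statistic. A sketch using Cohen's kappa, which corrects raw agreement for chance; the ratings shown are hypothetical:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters on the same sessions, corrected for chance."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in categories) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical 4-point grades from two evaluators observing the same ten sessions.
a = [4, 3, 3, 2, 4, 4, 3, 2, 1, 3]
b = [4, 3, 2, 2, 4, 3, 3, 2, 1, 3]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```

A kappa near 1.0 indicates the standardization is working; values much lower suggest the scale or the evaluator training needs attention.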
Clearly, evaluations such as First Look maneuvers and LOEs require high-quality data collection. Lower-quality data collection is acceptable for non-evaluative measurements (e.g., LOFT) and for measurements that are simple adjuncts to the training process. In addition, carriers should be guided by the following uses for higher- and lower-quality data.

Inferential-quality data should be used in AQP for:
- Extending the continuing qualification cycle
- Reducing or eliminating individual events from training or checking requirements

Descriptive-quality data may be used in AQP/SVT for:
- Reducing or extending the curriculum footprint
- Changing the set of fixed and/or variable maneuvers (or events)
- Validating currency items

5. Non-Sampling Errors

Non-sampling errors are embedded in the data collection and analytical process rather than in the process of sampling pilots. These errors are associated with the procedures used to collect, process, and analyze data, and they may be systematic or non-systematic. Systematic errors follow a pattern or are controlled by a function that may bias the observations in a specific direction. Non-systematic errors occur at random.

Systematic errors will bias any statistical analysis and produce misleading results. For example, if evaluators shift their use of a rating scale so that they give higher grades for the same performance, the data analysis may indicate a significant increase in performance, which would be highly misleading. Systematic errors are not affected by sample size and cannot be corrected by repeated sampling; they must be recognized and corrected. Non-systematic errors add noise to the data. If this noise is not controlled, inferential analyses can give indefinite results in which the answers to questions remain hidden. For descriptive analyses, non-systematic errors can cause more unpredictable and erratic estimates. Non-systematic errors should be minimized or avoided.

The causes of non-sampling errors are many and varied, but generally fall into the relatively broad categories of data definition, procedure, measurement, and data handling. Examples of non-sampling errors are:

- Improperly reporting the required data collection items (e.g., not reporting incidences of repeats, missing reason codes, not recording items, recording items incorrectly, improperly marked bubble sheets that cause the scanner to miss the intended mark, recording the last rather than the first occurrence of a measured item during a First Look session, etc.).
- Data handling errors (e.g., losing forms, separating simulator day 1 and day 2 forms, data entry or coding errors, etc.). There may always be some invalid entries, but data entry procedures can be constructed to screen the input for validity and help reduce these errors.
- Allowing events for measurement to be selected at the discretion of the instructor/evaluator (e.g., requiring a Non-Precision Approach to be accomplished, but allowing the instructor to select which Non-Precision Approach).
- Data definition errors. Overlapping or inconsistent data definitions are caused by ambiguities in describing the data to be collected. The items being measured must be clearly identified; otherwise the data collector, data analyst, and data user may be considering dissimilar items and, in effect, talk past each other. For example, consider two maneuvers defined as "Non-Precision Approach to a Missed Approach" and "Non-Precision Approach to a Landing." Depending on the viewer, these could be considered two, three, or four different maneuvers: two Non-Precision Approaches, one Missed Approach, and/or one Landing. If the component parts are to be considered a single maneuver and a repeat is required, is it because of the Approach, the Missed Approach, or the Landing?
- Evaluator grading errors. AQP and SVT measurements are judgment calls by instructors and evaluators. Subjective measurement of performance, poorly defined measurement standards, and lack of training on the use of subjective grading criteria can cause such errors.
- Lack of standardization among instructors and evaluators. A changing mix of instructors and evaluators may limit the opportunity to provide standardization training, which in turn can cause the performance assessment process to become somewhat like measuring with a "rubber ruler." (Refer to the discussion on Instructor/Evaluator Considerations later in the Guide.)
- Data collection procedures and policies that favor one response over another (e.g., the use of exception reporting and default responses), which virtually guarantees that adjacent ratings will be under-reported.
- Problems with data collection forms (e.g., poor or complex form design, using incorrect or obsolete forms, etc.).
- Reporter fatigue. Long forms, numerous items, and detailed responses can cause instructors and evaluators to rush through the forms without proper consideration.
- Errors in the computational algorithms used for sorting, reporting, and analyzing the data.
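The input screening mentioned above can be sketched as a small set of record checks applied at data entry. The field names, the 1-4 grade scale, and the reason-code rule are illustrative assumptions, not a prescribed format:

```python
VALID_GRADES = {1, 2, 3, 4}          # illustrative 4-point rating scale
REQUIRED = ("pilot_id", "event", "grade")

def screen(record: dict) -> list[str]:
    """Return a list of validity problems; an empty list means the record passes."""
    problems = [f"missing field: {f}" for f in REQUIRED if f not in record]
    if "grade" in record and record["grade"] not in VALID_GRADES:
        problems.append(f"grade out of range: {record['grade']}")
    # Illustrative rule: substandard grades must carry a reason code.
    if record.get("grade") in (1, 2) and not record.get("reason_code"):
        problems.append("substandard grade without a reason code")
    return problems

print(screen({"pilot_id": "P1", "event": "NPA", "grade": 2}))
```

Screening at entry catches omissions while the session is still fresh, rather than after the forms have been filed.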

SECTION 5. SVT VS. AQP DATA COLLECTION

1. General

AQP data collection can be split into two broad phases: data collection during the transition from the traditional recurrent training program to the actual AQP, and data collection associated with the approved AQP itself. Collecting SVT data is essentially a data collection process superimposed on the traditional recurrent training and line check program. Data are collected for all maneuvers or items performed or checked and consist of grades and/or repeats to proficiency with associated reason codes. AQP data collection, on the other hand, demands more rigorous attention and is required in all AQP curricula. However, good data management is required for both AQP and SVT. The differences and similarities between data management for AQP and SVT are described in the following table. Keep in mind, however, that the AQP data management requirements replace the SVT requirements for a specific fleet as soon as the AQP is approved for that fleet. Therefore, this Guide focuses primarily on AQP data management.

2. SVT/AQP Differences

Data Management Procedures
  SVT: Similar for all carriers; explicitly defined by FAA in each carrier's exemption.
  AQP: Established by SFAR 58 and the recommended procedures in this Guide; ultimately defined and agreed on between the individual carrier and the FAA Manager of AQP, and documented in the carrier's AQP plan.

Measured Events
  SVT: Defined by maneuvers.
  AQP: Derived from tasks/subtasks in front-end analysis and TPOs/SPOs in curriculum design.

Performance, Conditions, and Standards
  SVT: Based on published practical test standards; does not allow for variation from practical test standards for licensing purposes.
  AQP: Allows deviation from the practical test guide; based on the Qualification Standards document.

Reason Codes
  SVT: Requires reason codes to be reported for substandard performance on measured events; no explicit requirement to report CRM data to FAA.
  AQP: Requires identification of proficiency objective standards that were not met, including CRM standards.

Data Reporting
  SVT: Required only for crewmembers in recurrent training (continuing qualification).
  AQP: Required for all participants (crewmembers, instructors, and evaluators) in all curricula (qualification and continuing qualification).

Instructor/Evaluator Training
  SVT: Does not require instructor and evaluator training beyond the traditional regulations.
  AQP: Requires that instructor and evaluator curricula (indoctrination, qualification, and continuing qualification) be designed and that the application of measurement procedures be incorporated.

First Look
  SVT: Requirements highly abbreviated (limited to a core set of maneuvers); proficiency data may be obtained during normal training rather than in an explicitly convened first look session. No requirement to assess proficiency on currency items or to use first look data as a basis for considering the interval between training and checking.
  AQP: First Look data are defined by criticality and currency analysis and may be collected in any CQP simulator period.

LOFT Data
  SVT: Usually contains recurrent LOFT, but no LOE.
  AQP: Contains both LOFT and LOE; data are not required to be collected or reported for the LOFT.

Line Check Data
  SVT: Obtained only on the PIC rather than the entire crew; no requirement to assess currency item proficiency during line checks.
  AQP: Required on a crew basis for both proficiency validation and currency assessment.

Fleet Data
  SVT: Required on all fleets.
  AQP: Required only in fleets of aircraft that have transitioned to AQP.

Data Uses
  SVT: Not used for program validation purposes, but as a control tool to assure continuing safety of SVT.
  AQP: Employed to validate the AQP approach to training system development, to determine the future disposition of CRM for pass/fail purposes, and to provide an empirical basis for rule making on the future disposition of AQP.

LOE
  SVT: Not required.
  AQP: Required for AQP qualification and CQP; overall grade and event set grade.

Validation/Evaluation Data
  SVT: Failures, repeat counts, and reason codes.
  AQP: Comments or reason codes, overall period grades, and comments or reason codes for unsatisfactory performance.


More information

Maryland State Firemen s Association Executive Committee Meeting December 5, 2009

Maryland State Firemen s Association Executive Committee Meeting December 5, 2009 Maryland State Firemen s Association Executive Committee Meeting December 5, 2009 Maryland State Police Aviation Command Update Presented by: Major Andrew J. (A. J.) McAndrew Hello, my name is Major A.

More information

March 21, 2011. Dear Ranking Member Costello:

March 21, 2011. Dear Ranking Member Costello: U.S. Department of The Inspector General Office of Inspector General Transportation Washington, DC 20590 Office of the Secretary of Transportation March 21, 2011 The Honorable Jerry F. Costello Ranking

More information

TIER II STANDARD FOR FINANCIAL MANAGEMENT SPECIALISTS

TIER II STANDARD FOR FINANCIAL MANAGEMENT SPECIALISTS Job Classification Manual Page 1 of 60 TIER II STANDARD FOR FINANCIAL MANAGEMENT SPECIALISTS INTRODUCTION 1. This grade level standard illustrates the application of the ICSC Master Standard (Tier I) to

More information

California State University, Stanislaus Business Administration (MBA) ASSESSMENT REPORT 2008 09 and PLAN 2009 2010

California State University, Stanislaus Business Administration (MBA) ASSESSMENT REPORT 2008 09 and PLAN 2009 2010 California State University, Stanislaus Business Administration (MBA) ASSESSMENT REPORT 2008 09 and PLAN 2009 2010 I. OVERVIEW The MBA Program in the College of Business Administration reaffirms its commitment

More information

L-3 Army Fleet Support is an Equal Opportunity Employer We encourage minorities, women, protected veterans, and disabled individuals to apply.

L-3 Army Fleet Support is an Equal Opportunity Employer We encourage minorities, women, protected veterans, and disabled individuals to apply. L-3 Army Fleet Support is an Equal Opportunity Employer We encourage minorities, women, protected veterans, and disabled individuals to apply. L-3 Army Fleet Support Fort Rucker, Alabama JOB ANNOUNCEMENT

More information

Computer Science Graduate Degree Requirements

Computer Science Graduate Degree Requirements Computer Science Graduate Degree Requirements Duke University Department of Computer Science 1 Introduction 2 General Requirements This document defines the requirements set forth by the Department of

More information

2014 NIFA CRM Contestant Briefing Guide San Diego, California

2014 NIFA CRM Contestant Briefing Guide San Diego, California 2014 NIFA CRM Contestant Briefing Guide San Diego, California Region 2 SAFECON 2014 November 12 15 This document supports the 2014 NIFA Collegiate Cockpit Resource Management Simulation and is not for

More information

Advisory Circular. Subject: DAMAGE TOLERANCE INSPECTIONS FOR REPAIRS AND ALTERATIONS. Date: 11/20/07 Initiated by: ANM-100.

Advisory Circular. Subject: DAMAGE TOLERANCE INSPECTIONS FOR REPAIRS AND ALTERATIONS. Date: 11/20/07 Initiated by: ANM-100. U.S. Department of Transportation Federal Aviation Administration Advisory Circular Subject: DAMAGE TOLERANCE INSPECTIONS FOR REPAIRS AND ALTERATIONS Date: 11/20/07 Initiated by: ANM-100 AC No: 120-93

More information

The Life of Aviation Training

The Life of Aviation Training Chapter 1 The Life of Aviation Training Life is a culmination of the past, an awareness of the present, an indication of a future beyond knowledge, the quality that gives a touch of divinity to matter.

More information

Why assess operational behaviour? Seeing is Believing. Drift into Failure. The Problem with Learning. What to assess?

Why assess operational behaviour? Seeing is Believing. Drift into Failure. The Problem with Learning. What to assess? Why assess operational behaviour? Seeing is Believing Assessing CRM Competencies Dr. Nicklas Dahlstrom Human Factors Manager The Problem with Learning Drift into Failure Normal people doing normal work

More information

MPL. 15POS03 17 June 2014. Introduction. MPL Workshop Conclusions. Basic Flying Skills. Airmanship CRM

MPL. 15POS03 17 June 2014. Introduction. MPL Workshop Conclusions. Basic Flying Skills. Airmanship CRM 15POS03 17 June 2014 Introduction MPL Worldwide, in 2014, there are more than 22 certified MPL training programmes graduating new co-pilots who have earned the Multi-Crew Pilot License (MPL). These pilots

More information

Flight Operations Briefing Notes

Flight Operations Briefing Notes Flight Operations Briefing Notes I Introduction Strict adherence to suitable standard operating procedures (SOPs) and normal checklists is an effective method to : Prevent or mitigate crew errors; Anticipate

More information

Summary of GAO Cost Estimate Development Best Practices and GAO Cost Estimate Audit Criteria

Summary of GAO Cost Estimate Development Best Practices and GAO Cost Estimate Audit Criteria Characteristic Best Practice Estimate Package Component / GAO Audit Criteria Comprehensive Step 2: Develop the estimating plan Documented in BOE or Separate Appendix to BOE. An analytic approach to cost

More information

California State University, Stanislaus Doctor of Education (Ed.D.), Educational Leadership Assessment Plan

California State University, Stanislaus Doctor of Education (Ed.D.), Educational Leadership Assessment Plan California State University, Stanislaus Doctor of Education (Ed.D.), Educational Leadership Assessment Plan (excerpt of the WASC Substantive Change Proposal submitted to WASC August 25, 2007) A. Annual

More information

The Development and Testing of NOTECHS: Non-Technical Skills for Airline Pilots

The Development and Testing of NOTECHS: Non-Technical Skills for Airline Pilots The Development and Testing of NOTECHS: Non-Technical Skills for Airline Pilots Rhona Flin Industrial Psychology Research Centre University of Aberdeen This paper provides back ground information on the

More information

Update on Current Corporate Aviation Accidents. Robert L. Sumwalt NTSB Board Member April 20, 2011

Update on Current Corporate Aviation Accidents. Robert L. Sumwalt NTSB Board Member April 20, 2011 Update on Current Corporate Aviation Accidents Robert L. Sumwalt NTSB Board Member April 20, 2011 The Board The investigators Corporate Aviation / Part 135 Fatal Accidents since last CASS Accident Date

More information

Crew Resource Management (CRM)

Crew Resource Management (CRM) King Schools Online Internet Learning Programs Crew Resource Management (CRM) Syllabus King Schools, Inc. 3840 Calle Fortunada San Diego, CA 92123 800-854-1001 (USA) 858-541-2200 (Worldwide) www.kingschoolsonline.com

More information

Student Academic Achievement Committee (SAAC) Standardized Report Form

Student Academic Achievement Committee (SAAC) Standardized Report Form Student Academic Achievement Committee (SAAC) Standardized Report Form INTRODUCTION Program/Discipline Title: Aviation Maintenance Technology Time Period: spring 2011, summer 2011, fall 2011 Program goals,

More information

THE FLIGHT OPTIONS APPROACH TO SAFETY. Fractional Membership Jet Card

THE FLIGHT OPTIONS APPROACH TO SAFETY. Fractional Membership Jet Card THE FLIGHT OPTIONS APPROACH TO SAFETY Fractional Membership Jet Card A s the premier provider of private jet travel, Flight Options number one priority is the safety of our customers and our employees.

More information

Academic Program Assessment Plan Graduate Programs Hasan School of Business CSU-Pueblo

Academic Program Assessment Plan Graduate Programs Hasan School of Business CSU-Pueblo Academic Program Assessment Plan Graduate Programs Hasan School of Business CSU-Pueblo Identification This is the assessment plan for graduate programs at the Hasan School of Business (HSB) at Colorado

More information

GAO AVIATION SAFETY. FAA s Use of Emergency Orders to Revoke or Suspend Operating Certificates

GAO AVIATION SAFETY. FAA s Use of Emergency Orders to Revoke or Suspend Operating Certificates GAO United States General Accounting Office Testimony Before the Subcommittee on Aviation, Committee on Transportation and Infrastructure, House of Representatives For Release on Delivery Expected at 9:30

More information

Threat and Error Management

Threat and Error Management Threat and Error Management Society of Experimental Test Pilots April 28, 2009 Robert Sumwalt, Board Member NTSB Threat and Error Management: A Practical Perspective Building a wall How do we improve safety?

More information

SA 530 AUDIT SAMPLING. Contents. (Effective for audits of financial statements for periods beginning on or after April 1, 2009) Paragraph(s)

SA 530 AUDIT SAMPLING. Contents. (Effective for audits of financial statements for periods beginning on or after April 1, 2009) Paragraph(s) SA 530 AUDIT SAMPLING (Effective for audits of financial statements for periods beginning on or after April 1, 2009) Contents Introduction Paragraph(s) Scope of this SA... 1 2 Effective Date... 3 Objective...

More information

2. SUMMER ADVISEMENT AND ORIENTATION PERIODS FOR NEWLY ADMITTED FRESHMEN AND TRANSFER STUDENTS

2. SUMMER ADVISEMENT AND ORIENTATION PERIODS FOR NEWLY ADMITTED FRESHMEN AND TRANSFER STUDENTS Chemistry Department Policy Assessment: Undergraduate Programs 1. MISSION STATEMENT The Chemistry Department offers academic programs which provide students with a liberal arts background and the theoretical

More information

AIRLINE SAFETY AND FEDERAL AVIATION ADMINISTRATION EXTENSION ACT OF 2010

AIRLINE SAFETY AND FEDERAL AVIATION ADMINISTRATION EXTENSION ACT OF 2010 PUBLIC LAW 111 216 AUG. 1, 2010 AIRLINE SAFETY AND FEDERAL AVIATION ADMINISTRATION EXTENSION ACT OF 2010 VerDate Nov 24 2008 15:43 Aug 11, 2010 Jkt 089139 PO 00216 Frm 00001 Fmt 6579 Sfmt 6579 E:\PUBLAW\PUBL216.111

More information

Manual on Training Needs Assessment

Manual on Training Needs Assessment Project on Improvement of Local Administration in Cambodia What is Training Needs Assessment? Five Steps of Training Needs Assessment Step 1: Identify Problem and Needs Step 2: Determine Design of Needs

More information

Utilizing the Capstone Experience to Assess a MBA Program: Case Analyses, Groups, and Rubrics

Utilizing the Capstone Experience to Assess a MBA Program: Case Analyses, Groups, and Rubrics Utilizing the Capstone Experience to Assess a MBA Program: Case Analyses, Groups, and Rubrics Floyd G. Willoughby, Oakland University James Suhay, Oakland University Assessment is fast becoming a requirement

More information

TIPS DATA QUALITY STANDARDS ABOUT TIPS

TIPS DATA QUALITY STANDARDS ABOUT TIPS 2009, NUMBER 12 2 ND EDITION PERFORMANCE MONITORING & EVALUATION TIPS DATA QUALITY STANDARDS ABOUT TIPS These TIPS provide practical advice and suggestions to USAID managers on issues related to performance

More information

2. APPLICABILITY. This AC provides information for any person who engages in public aircraft operations (PAO) as defined by the statute.

2. APPLICABILITY. This AC provides information for any person who engages in public aircraft operations (PAO) as defined by the statute. U.S. Department of Transportation Federal Aviation Administration Advisory Circular Subject: Public Aircraft Operations Date: 2/12/14 Initiated by: AFS-800 AC No: 00-1.1A Change: 1. PURPOSE. This advisory

More information

Northwestern Michigan College Supervisor: Employee: Department: Office of Research, Planning and Effectiveness

Northwestern Michigan College Supervisor: Employee: Department: Office of Research, Planning and Effectiveness General Information Company: Northwestern Michigan College Supervisor: Employee: Department: Office of Research, Planning and Effectiveness Employee No: Detail Job Title: Coordinator of Data Reporting

More information

Develop Project Charter. Develop Project Management Plan

Develop Project Charter. Develop Project Management Plan Develop Charter Develop Charter is the process of developing documentation that formally authorizes a project or a phase. The documentation includes initial requirements that satisfy stakeholder needs

More information

Federal Aviation Administration. Kathy Abbott and Robert Burke Federal Aviation Administration 4 February 2015

Federal Aviation Administration. Kathy Abbott and Robert Burke Federal Aviation Administration 4 February 2015 Operational Use of Flight Path Management Systems: Status of Recommendations of the Performance-Based Operations Aviation Rulemaking Committee (PARC)/ Commercial Aviation Safety Team (CAST) Flight Deck

More information

Safety Verification of the Small Aircraft Transportation System Concept of Operations

Safety Verification of the Small Aircraft Transportation System Concept of Operations Safety Verification of the Small Aircraft Transportation System Concept of Operations Victor Carreño 1 NASA Langley Research Center, Hampton, Virginia, 23681 César Muñoz 2 National Institute of Aerospace,

More information

T AB LE OF C ONT EN TS

T AB LE OF C ONT EN TS T AB LE OF C ONT EN TS PART 1 - PHILOSOPHY OF STUDY AND SELECTION OF PARTICIPANTS Background 1 Developing the Proficiency Model. 2 Selection of Carriers.. 3 Selection of Carriers for On-site Visits.. 3

More information

Classification Appeal Decision Under section 5112 of title 5, United States Code

Classification Appeal Decision Under section 5112 of title 5, United States Code Classification Appeal Decision Under section 5112 of title 5, United States Code Appellant: Agency classification: Organization: OPM decision: OPM decision number: [name] Program Support Assistant GS-303-7

More information

CRM Training. UK AltMOC. 3.1 Acceptable Means of Compliance (AMC) and Guidance Material (GM) (Draft EASA Decisions)

CRM Training. UK AltMOC. 3.1 Acceptable Means of Compliance (AMC) and Guidance Material (GM) (Draft EASA Decisions) UK CAA Flight Operations CRM Training UK AltMOC 3.1 Acceptable Means of Compliance (AMC) and Guidance Material (GM) (Draft EASA Decisions) 3.1.2 Air operations Decision 2012/017/R (Part-ORO) SUBPART FC

More information

Solvency II Data audit report guidance. March 2012

Solvency II Data audit report guidance. March 2012 Solvency II Data audit report guidance March 2012 Contents Page Introduction Purpose of the Data Audit Report 3 Report Format and Submission 3 Ownership and Independence 4 Scope and Content Scope of the

More information

Aviation Accreditation Board International Response to the Notice of Proposed Rulemaking

Aviation Accreditation Board International Response to the Notice of Proposed Rulemaking Introduction: The Aviation Accreditation Board International (AABI) is pleased to submit this response to the NPRM. This submittal will be sent both electronically on the FAA Web site and by package courier.

More information

See Appendix A for the petition submitted to the FAA describing the proposed operations and the regulations that the petitioner seeks an exemption.

See Appendix A for the petition submitted to the FAA describing the proposed operations and the regulations that the petitioner seeks an exemption. October 30, 2015 Exemption No. 13453 Regulatory Docket No. FAA 2015 2819 Ms. Joanne Williamson Hawaiian Electric Companies 820 Ward Avenue Honolulu, HI 96814 Dear Ms. Williamson: This letter is to inform

More information

Understanding the Entity and Its Environment and Assessing the Risks of Material Misstatement

Understanding the Entity and Its Environment and Assessing the Risks of Material Misstatement Understanding the Entity and Its Environment 1667 AU Section 314 Understanding the Entity and Its Environment and Assessing the Risks of Material Misstatement (Supersedes SAS No. 55.) Source: SAS No. 109.

More information

Answers to Review Questions

Answers to Review Questions Tutorial 2 The Database Design Life Cycle Reference: MONASH UNIVERSITY AUSTRALIA Faculty of Information Technology FIT1004 Database Rob, P. & Coronel, C. Database Systems: Design, Implementation & Management,

More information

CODE OF CONDUCT FUNDAMENTALS FOR CREDIT RATING AGENCIES

CODE OF CONDUCT FUNDAMENTALS FOR CREDIT RATING AGENCIES CODE OF CONDUCT FUNDAMENTALS FOR CREDIT RATING AGENCIES THE TECHNICAL COMMITTEE OF THE INTERNATIONAL ORGANIZATION OF SECURITIES COMMISSIONS DECEMBER 2004 CODE OF CONDUCT FUNDAMENTALS FOR CREDIT RATING

More information

TITLE 23: EDUCATION AND CULTURAL RESOURCES SUBTITLE A: EDUCATION CHAPTER I: STATE BOARD OF EDUCATION SUBCHAPTER b: PERSONNEL

TITLE 23: EDUCATION AND CULTURAL RESOURCES SUBTITLE A: EDUCATION CHAPTER I: STATE BOARD OF EDUCATION SUBCHAPTER b: PERSONNEL ISBE 23 ILLINOIS ADMINISTRATIVE CODE 50 Section 50.10 Purpose 50.20 Applicability 50.30 Definitions TITLE 23: EDUCATION AND CULTURAL RESOURCES : EDUCATION CHAPTER I: STATE BOARD OF EDUCATION : PERSONNEL

More information

1 (a) Audit strategy document Section of document Purpose Example from B-Star

1 (a) Audit strategy document Section of document Purpose Example from B-Star Answers Fundamentals Level Skills Module, Paper F8 (IRL) Audit and Assurance (Irish) June 2009 Answers 1 (a) Audit strategy document Section of document Purpose Example from B-Star Understanding the entity

More information

COMPLIANCE WITH THIS PUBLICATION IS MANDATORY

COMPLIANCE WITH THIS PUBLICATION IS MANDATORY BY ORDER OF THE SECRETARY OF THE AIR FORCE AIR FORCE INSTRUCTION 11-290 15 OCTOBER 2012 Flying Operations COCKPIT/CREW RESOURCE MANAGEMENT PROGRAM COMPLIANCE WITH THIS PUBLICATION IS MANDATORY ACCESSIBILITY:

More information

ASSURE UAS Research and Development Program Research Abstract FAA Research Requirement: UAS Research Focus Area: EXECUTIVE SUMMARY:

ASSURE UAS Research and Development Program Research Abstract FAA Research Requirement: UAS Research Focus Area: EXECUTIVE SUMMARY: ASSURE UAS Research and Development Program Research Abstract FAA Research Requirement: UAS Maintenance, Modification, Repair, Inspection, Training, and Certification Considerations UAS Research Focus

More information

The Challenges of Medical Events in Flight

The Challenges of Medical Events in Flight The Challenges of Medical Events in Flight By Paulo Alves, MD, MSc and Heidi MacFarlane, MEd A MedAire-Sponsored Paper In-flight medical events (IFMEs) represent a challenge for airlines. The problem starts

More information

Audit Sampling. HKSA 530 Issued July 2009; revised July 2010

Audit Sampling. HKSA 530 Issued July 2009; revised July 2010 HKSA 530 Issued July 2009; revised July 2010 Effective for audits of financial statements for periods beginning on or after 15 December 2009 Hong Kong Standard on Auditing 530 Audit Sampling COPYRIGHT

More information

Measurement Information Model

Measurement Information Model mcgarry02.qxd 9/7/01 1:27 PM Page 13 2 Information Model This chapter describes one of the fundamental measurement concepts of Practical Software, the Information Model. The Information Model provides

More information

Office of Inspector General. Audit Report FAA DELAYS IN ESTABLISHING A PILOT RECORDS DATABASE LIMIT AIR CARRIERS ACCESS TO BACKGROUND INFORMATION

Office of Inspector General. Audit Report FAA DELAYS IN ESTABLISHING A PILOT RECORDS DATABASE LIMIT AIR CARRIERS ACCESS TO BACKGROUND INFORMATION Office of Inspector General Audit Report FAA DELAYS IN ESTABLISHING A PILOT RECORDS DATABASE LIMIT AIR CARRIERS ACCESS TO BACKGROUND INFORMATION Federal Aviation Administration Report Number: AV-2015-079

More information

Section VI Principles of Laboratory Biosecurity

Section VI Principles of Laboratory Biosecurity Section VI Principles of Laboratory Biosecurity Since the publication of the 4th edition of BMBL in 1999, significant events have brought national and international scrutiny to the area of laboratory security.

More information

U.S. DEPARTMENT OF TRANSPORTATION FEDERAL AVIATION ADMINISTRATION. National Policy. SUBJ: OpSpec A021, Helicopter Air Ambulance (HAA) Operations

U.S. DEPARTMENT OF TRANSPORTATION FEDERAL AVIATION ADMINISTRATION. National Policy. SUBJ: OpSpec A021, Helicopter Air Ambulance (HAA) Operations NOTICE U.S. DEPARTMENT OF TRANSPORTATION FEDERAL AVIATION ADMINISTRATION National Policy N 8900.A021 Effective Date: XX/XX/XX Cancellation Date: XX/XX/XX SUBJ: OpSpec A021, Helicopter Air Ambulance (HAA)

More information

4 Testing General and Automated Controls

4 Testing General and Automated Controls 4 Testing General and Automated Controls Learning Objectives To understand the reasons for testing; To have an idea about Audit Planning and Testing; To discuss testing critical control points; To learn

More information

Proposed Consequential and Conforming Amendments to Other ISAs

Proposed Consequential and Conforming Amendments to Other ISAs IFAC Board Exposure Draft November 2012 Comments due: March 14, 2013, 2013 International Standard on Auditing (ISA) 720 (Revised) The Auditor s Responsibilities Relating to Other Information in Documents

More information

Example Application of Aircrew Incident Reporting System (AIRS)

Example Application of Aircrew Incident Reporting System (AIRS) Example Application of Aircrew Incident Reporting System (AIRS) Prepared by: Jean-Jacques Speyer Director Operational Evaluation, Human Factors and Communication Airbus 1 Rond Point Bellonte 31707 Blagnac

More information

TITLE 23: EDUCATION AND CULTURAL RESOURCES SUBTITLE A: EDUCATION CHAPTER I: STATE BOARD OF EDUCATION SUBCHAPTER b: PERSONNEL

TITLE 23: EDUCATION AND CULTURAL RESOURCES SUBTITLE A: EDUCATION CHAPTER I: STATE BOARD OF EDUCATION SUBCHAPTER b: PERSONNEL ISBE 23 ILLINOIS ADMINISTRATIVE CODE 50 Section 50.10 Purpose 50.20 Applicability 50.30 Definitions TITLE 23: EDUCATION AND CULTURAL RESOURCES : EDUCATION CHAPTER I: STATE BOARD OF EDUCATION : PERSONNEL

More information

OPERATIONS CIRCULAR. OC NO 2 OF 2014 Date: 1 st May 2014. Continuous Descent Final Approach (CDFA) 1. PURPOSE

OPERATIONS CIRCULAR. OC NO 2 OF 2014 Date: 1 st May 2014. Continuous Descent Final Approach (CDFA) 1. PURPOSE GOVERNMENT OF INDIA CIVIL AVIATION DEPARTMENT DIRECTOR GENERAL OF CIVIL AVIATION OC NO 2 OF 2014 Date: 1 st May 2014 OPERATIONS CIRCULAR Subject: Continuous Descent Final Approach (CDFA) 1. PURPOSE This

More information

Glossary of Terms Ability Accommodation Adjusted validity/reliability coefficient Alternate forms Analysis of work Assessment Battery Bias

Glossary of Terms Ability Accommodation Adjusted validity/reliability coefficient Alternate forms Analysis of work Assessment Battery Bias Glossary of Terms Ability A defined domain of cognitive, perceptual, psychomotor, or physical functioning. Accommodation A change in the content, format, and/or administration of a selection procedure

More information

INTERNATIONAL STANDARD ON AUDITING 530 AUDIT SAMPLING

INTERNATIONAL STANDARD ON AUDITING 530 AUDIT SAMPLING INTERNATIONAL STANDARD ON 530 AUDIT SAMPLING (Effective for audits of financial statements for periods beginning on or after December 15, 2009) CONTENTS Paragraph Introduction Scope of this ISA... 1 2

More information

POST-ACCIDENT FLIGHTCREW REPRESENTATION in a Nutshell. Pilot Testimony in Accident Cases SOURCE - Board 1982; AMENDED - Executive Board May 1987

POST-ACCIDENT FLIGHTCREW REPRESENTATION in a Nutshell. Pilot Testimony in Accident Cases SOURCE - Board 1982; AMENDED - Executive Board May 1987 POST-ACCIDENT FLIGHTCREW REPRESENTATION in a Nutshell Prepared by: Kevin Fitzpatrick ALPA Representation Department Staff Meeting June 5, 1997 ALPA Policy ALPA Administrative Manual Section 80, Paragraph

More information

Mauro Calvano. About Aviation Safety Management Systems

Mauro Calvano. About Aviation Safety Management Systems Mauro Calvano About Aviation Safety Management Systems January 2003 1 INTRODUCTION In order to be aware of the factors that are driving the accident rate during the last decade, we must identify the hazards

More information

The Myth of the Unstable Approach

The Myth of the Unstable Approach The Myth of the Unstable Approach Dr Ed Wischmeyer Embry-Riddle Aeronautical University, USA Author Biography: Dr. Wischmeyer has 6 years experience in direct flight safety research, 5 years experience

More information

Flight Review. The flight review is required by Federal Aviation Regulations for all pilots who intend to act as pilot in command of an aircraft.

Flight Review. The flight review is required by Federal Aviation Regulations for all pilots who intend to act as pilot in command of an aircraft. S A F E T Y A D V I S O R Regulations No. 2 Pilot s Guide to the Flight Review This Safety Advisor provides guidance to pilots and flight instructors for the conduct of flight reviews and to convey current

More information