3. Introduction to Competency Assessment Systems

3.1 Assessment system attributes

There are many desirable qualities of a competency assessment system. A few important ones are defined here.

compliance with a quality management system - the Toolkit follows quality management system (QMS) principles. When applied, the Toolkit will fit directly into an existing or a future QMS. The goal is to describe processes and to improve quality over time. The collection of evidence and the documentation of results are as critical in competency assessment as they are for any QMS.

authenticity - any assessment system must be authentic; that is, it must reflect the true job done by the AMP. Tools are designed to be used end to end. In many cases, competency criteria must be assessed collectively. For example, an AMO would never observe a thunderstorm without also observing cloud, wind and other parameters. To be authentic, the assessment must follow the same integrated approach used on the job.

repeatability - the assessment system should be as repeatable as possible, with randomness removed to the extent practical. This will ensure a common and consistent level of competence.

fairness - the assessment system should be well understood by all those subject to it. It should be connected to remedial training or experience in cases where employees are not yet competent. Shortcomings should be explained, in particular why each shortcoming is important for the job.

validity - the assessment system should accurately measure individuals against the relevant competency standard at the minimum satisfactory level of performance.

3.2 Types of Tool

There are several different types of tool, which are intended as examples to be adapted for use in different locations. The main types are:

  • direct observation of job responsibilities
  • traditional or multiple choice tests
  • experiential questions
  • simulations
  • portfolio evidence

Evidence is defined as information, data, materials, or documentation that supports inferences, conclusions, or judgments. It can be objective or subjective and can take many forms, including written papers, documents or cases demonstrating the results and activities connected to operational simulations, supervisor comments, or self-assessment. Evidence must be relevant, representative, repeatable and verifiable, and it ensures the traceability of the process as part of the QMS.
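Because evidence must remain traceable as part of the QMS, each item of evidence is worth recording with enough metadata to link it back to the assessee, the criterion, the tool used and the outcome. The following is a minimal illustrative sketch of such a record; the class and field names are assumptions chosen for illustration and are not prescribed by the Toolkit.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class EvidenceRecord:
    """Illustrative evidence record supporting QMS traceability (field names are assumptions)."""
    assessee: str             # AMP being assessed
    assessor: str             # person conducting the assessment
    criterion: str            # performance criterion, e.g. "2.1 wind"
    tool: str                 # tool used, e.g. "direct observation"
    date_collected: date      # when the evidence was gathered
    description: str          # what was observed, answered or submitted
    outcome: str              # e.g. "competent" or "not yet competent"
    supporting_documents: List[str] = field(default_factory=list)  # file or case references

# Example: documenting a direct observation of the wind criterion
record = EvidenceRecord(
    assessee="A. Observer",
    assessor="Q. Manager",
    criterion="2.1 wind",
    tool="direct observation",
    date_collected=date(2024, 5, 14),
    description="Correctly measured and encoded a gusting crosswind in the routine observation.",
    outcome="competent",
)
```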

3.3 Tool Descriptions

Direct observation - Direct observation evaluates the individual performing a task in real time. It enables the assessment of the actual processes employed by individuals undertaking operational activities. Performance assessments examine an AMP's actual application of knowledge to solve problems. The assessment focuses on both the process and the outcome.

Direct observation can be done in the real job environment or in a different environment using simulators (see "Simulations" below). Because direct observation is limited to whatever is occurring at the time, it is usually combined with other tools to cover parts of the job that occur less commonly.

Advantages:

  • When combined with other tools, offers a complete picture of the assessee's competence
  • Is the most authentic tool

Disadvantages:

  • The weather is rarely the same from one assessment to the next, making it harder to ensure consistency across many AMPs
  • Unlikely to be suitable for assessing rare events

Tests - Tests are a more traditional method of assessing knowledge. In some cases in the Toolkit, knowledge is used as a substitute for competence to make the assessment practical and cost-effective. Many styles of test can be used, including multiple choice, short answer and more open-ended questions.

Advantages:

  • Can deal with some unusual aspects of competence that could not be covered through direct observation
  • Are repeatable

Disadvantage:

  • Because they focus on knowledge, tests can be difficult to construct as genuine tests of competence

Experiential Questions - Experiential questions are like tests, but the question asked is of the form "What would you do if...?". Experiential questions can be posed in written or oral form. In the case of oral questions, it is very important that the answers and results are documented.

Advantage:

  • A very useful tool for completing the picture of competence built through direct observation, covering the aspects that could not be observed

Disadvantage:

  • Not as authentic as direct observation

Simulations - A simulation presents the forecaster or observer with a real or hypothetical situation and asks him or her to respond as if on the job. Simulations range from simple questions, such as experiential questions, to full operational simulators.

Advantage:

  • Can cover any of the aspects not covered through direct observation, or even replace that tool entirely, depending on the complexity of the simulation

Disadvantages:

  • Scenarios are difficult and time-consuming to create
  • Building a library of case studies is time-consuming, and the library may be difficult to maintain

Portfolio - A portfolio contains evidence of knowledge, ability or competence based on past experience. Portfolio evidence can be powerful in demonstrating competency because it provides clear evidence of what an individual has done. A portfolio could even describe unsuccessful examples along with the remedial work the AMP has undertaken to address the deficiency.

Advantages:

  • Promotes self-evaluation, reflection, and critical thinking
  • Measures performance based on genuine samples of the AMP's work

Disadvantages:

  • May cover only the aspects the assessee selects, depending on how the portfolio is developed
  • Can be time-consuming for the assessee to collate, and presents challenges if they have not forecast/observed a particular situation in the past
  • Can be time-consuming for assessors to mark if many people require assessment

3.4 Using multiple tools to build an assessment framework

Most competency assessments will be carried out using direct observation. Of the tools available, direct observation is the most authentic and is used for most performance criteria that can be observed frequently. Other techniques, such as tests or experiential questions, will often be needed to provide evidence of competence for seasonal or rare events. A well-developed competency assessment system will use multiple tools and methods to demonstrate that personnel meet the required competencies.
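As a concrete illustration of combining tools, the short sketch below checks that every performance criterion in a draft assessment plan is covered by at least one tool and flags criteria that rely on direct observation alone, since those may need a test, simulation or experiential question for rare or seasonal events. The criteria and tool assignments shown are hypothetical examples, not taken from the Toolkit matrices.

```python
# Hypothetical assessment plan: performance criterion -> assessment tools assigned.
# The entries are illustrative only.
plan = {
    "2.1 wind": ["direct observation"],
    "2.1 thunderstorms": ["direct observation", "simulation"],
    "3.1 volcanic ash": ["experiential questions"],
    "3.2 warnings issued on time": [],
}

for criterion, tools in plan.items():
    if not tools:
        print(f"{criterion}: NOT COVERED - assign at least one tool")
    elif tools == ["direct observation"]:
        print(f"{criterion}: direct observation only - consider a supplementary "
              "tool in case the event does not occur during the assessment")
    else:
        print(f"{criterion}: covered by {', '.join(tools)}")
```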

3.5 Competency Assessment Matrix

Table 1 shows a competency assessment matrix for aeronautical meteorological observers (AMO), while Table 2 is a matrix for aeronautical meteorological forecasters (AMF). The specific performance criteria are listed in the left column, while each of the other columns represents one tool or type of tool.

A check mark (✓) indicates that the specific criterion can be assessed using the tool as provided and that this tool is recommended. A lightning icon (⚡) indicates that the performance criterion might be assessed on a given day with the direct observation tool, depending on the weather that occurs. A snowflake icon (❄) signifies that this type of tool could be developed to assess the specific criterion. A sketch of how the matrix might be held in a simple electronic record follows Table 2.


| Competency                            | Tool 23 | Tool 25 | Tool 22 | Tool 21 | Tool 24 |
|---------------------------------------|---------|---------|---------|---------|---------|
| 1 Analyze and describe weather        |         |         |         |         |         |
| 2.1 general observing process         | ✓       |         |         |         |         |
| 2.1 wind                              | ✓       |         |         |         |         |
| 2.1 visibility, RVR and vertical vis  | ⚡      |         |         | ✓       |         |
| 2.1 significant wx phenomena          | ⚡      |         |         |         |         |
| 2.1 cloud type, ceiling               | ⚡      |         |         |         |         |
| 2.1 temperature                       | ✓       |         |         | ✓       |         |
| 2.1 pressure                          | ✓       |         |         |         |         |
| 2.1 other phenomena                   | ⚡      |         |         |         |         |
| 2.2 interpret various sensors         | ✓       |         |         |         |         |
| 2.3 issue obs on time                 | ✓       |         |         |         |         |
| 2.3 issue obs in correct format       | ✓       |         | ✓       |         |         |
| 2.3 obs are within amend criteria     | ✓       |         | ✓       |         |         |
| 3.1 apply QMS                         |         |         | ✓       |         |         |
| 3.2 quality check observations        |         |         |         |         |         |
| 3.3 correct errors                    |         |         | ✓       |         |         |
| 3.4 monitor operational systems       |         |         | ✓       |         |         |
| 4.1 ensure dissemination of obs       |         | ✓       |         |         |         |
| 4.2 presentation of met info          |         |         |         |         | ✓       |
| 4.3 alert forecasters to sig wx       |         | ✓       |         |         |         |

Table 1: Competency Assessment Matrix for Aeronautical Meteorological Observers. Links lead to the specific tools as well as details of the performance criteria.


| Competency                              | Tool 3 | Tool 5 | Tool 1 | Tool 2 | Tool 6 | Tool 4 | Tool 7 |
|-----------------------------------------|--------|--------|--------|--------|--------|--------|--------|
| 1.1 analysis                            | ✓      |        |        |        |        |        |        |
| 1.1 diagnosis                           | ✓      |        |        |        |        |        |        |
| 1.2 monitor situation                   | ✓      |        |        |        |        |        |        |
| 1.3 assess need for amd                 | ✓      |        |        | ✓      | ✓      |        | ✓      |
| 2.1 general forecast process            | ✓      |        |        |        |        |        |        |
| 2.1 temperature/humidity                | ✓      |        |        |        |        |        |        |
| 2.1 wind                                | ✓      |        |        |        | ✓      |        |        |
| 2.1 pressure                            | ✓      |        |        |        |        |        |        |
| 2.1 cloud                               | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 2.1 precipitation                       | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 2.1 reduced visibility                  | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 2.1 obstructions to visibility          | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 2.1 thunderstorms                       | ⚡     |        | ❄      |        | ❄      |        | ✓      |
| 2.1 turbulence                          | ⚡     |        | ✓      |        | ❄      |        | ❄      |
| 2.1 icing                               | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 2.1 wake vortex                         | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 2.2 forecast on time                    | ✓      |        |        |        |        |        |        |
| 2.2 correct format                      | ✓      |        |        | ✓      |        | ✓      |        |
| 2.2 within amend criteria               | ✓      |        |        | ✓      |        | ✓      |        |
| 2.3 monitor adjacent forecasts/warnings | ✓      |        |        |        |        | ✓      | ✓      |
| 2.3 liaise with adjacent regions        | ✓      |        |        |        |        | ✓      | ✓      |
| 3.1 severe thunderstorms                | ⚡     |        | ❄      |        | ❄      |        | ✓      |
| 3.1 severe turbulence                   | ⚡     |        | ✓      |        | ❄      |        | ❄      |
| 3.1 severe wind and shear               | ⚡     |        | ❄      |        | ✓      |        | ❄      |
| 3.1 severe icing                        | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 3.1 cloud below aerodrome minima        | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 3.1 hazardous phenomena                 | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 3.1 sand/dust storms                    | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 3.1 volcanic ash                        | ⚡     | ✓      | ❄      |        | ❄      |        | ❄      |
| 3.1 tropical cyclones                   | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 3.1 radioactive cloud                   | ⚡     |        | ❄      |        | ❄      |        | ❄      |
| 3.2 warnings issued on time             | ✓      |        |        |        |        |        |        |
| 3.2 correct format                      |        |        |        | ✓      |        | ✓      |        |
| 3.2 warnings meet update criteria       |        |        |        | ✓      |        | ✓      |        |
| 3.3 monitor adjacent forecasts/warnings | ✓      |        |        |        |        | ✓      |        |
| 3.3 liaise with adjacent regions        | ✓      |        |        |        |        | ✓      |        |
| 4.1 apply QMS                           |        |        |        | ✓      |        |        |        |
| 4.2 errors/unrepresentative obs         | ✓      |        |        |        |        |        |        |
| 4.3 validates information               |        |        |        |        |        |        |        |
| 4.4 monitor systems                     | ✓      |        |        |        |        |        |        |
| 5.1 ensure dissemination                | ✓      |        |        |        |        |        |        |
| 5.2 briefing                            | ✓      |        |        |        |        | ✓      | ✓      |

Table 2: Competency Assessment Matrix for Aeronautical Meteorological Forecasters. Links lead to the specific tools as well as details of the performance criteria.
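For services that manage competency records electronically, the matrix can be treated as a lookup from (criterion, tool) pairs to the marker types described in section 3.5. The sketch below encodes a few cells of Table 2 to show the idea; the enum names and dictionary layout are illustrative assumptions rather than part of the Toolkit.

```python
from enum import Enum

class Marker(Enum):
    RECOMMENDED = "check mark: tool provided and recommended"
    WEATHER_DEPENDENT = "lightning: assessable by direct observation if the weather occurs"
    COULD_DEVELOP = "snowflake: a tool of this type could be developed"

# Partial encoding of a few cells of Table 2, keyed by (criterion, tool).
matrix = {
    ("2.1 thunderstorms", "Tool 3"): Marker.WEATHER_DEPENDENT,
    ("2.1 thunderstorms", "Tool 1"): Marker.COULD_DEVELOP,
    ("2.1 thunderstorms", "Tool 7"): Marker.RECOMMENDED,
    ("3.1 volcanic ash", "Tool 3"): Marker.WEATHER_DEPENDENT,
    ("3.1 volcanic ash", "Tool 5"): Marker.RECOMMENDED,
}

def tools_for(criterion: str) -> dict:
    """Return the tools applicable to a criterion, with their marker types."""
    return {tool: marker for (crit, tool), marker in matrix.items() if crit == criterion}

print(tools_for("2.1 thunderstorms"))
```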