Assessment Policy & Procedure

Policy Statement
TrainSmart Australia is committed to ensuring that the process of assessment (including the recognition of prior learning) in all VET programs is conducted in accordance with the four Principles of Assessment (assessments must be valid, reliable, flexible and fair) and the Rules of Evidence (evidence must be valid, current, sufficient and authentic) as endorsed by the National Quality Council.

Responsibilities: 
The National Training Compliance Manager is responsible for ensuring:

  • the policy is updated and distributed as required

Assessment Procedure

Assessment Planning/Development

Purchased material

In the case of materials being purchased “off the shelf”:

  • The National Training Compliance Manager will source materials appropriate for the relevant qualification and present those materials for review to the Faculty Team Leader.
  • A Curriculum Review Team will be appointed by the Faculty Team Leader in consultation with the National Training Services Manager to evaluate, contextualize, alter (make the necessary adjustments as appropriate) and map resources prior to use
  • The Curriculum Review Team will save the revised documents as a new version
  • Once the Faculty Team Leader emails the National Training Compliance Manager that the materials have been evaluated, the National Training Compliance Manager will instruct the National IT Manager to archive the old documents
  • The National IT Manager will create the capsule resources (workbook, learning activities, assessment tasks) and upload them to the relevant UoC Learning Capsule in Moodle for use by trainers and students
  • The National IT Manager will release the UoC in Moodle and advise Faculty Team Leaders and relevant trainers by email when completed

  • The National IT Manager will update the appropriate spreadsheets

In-house materials

An Assessment Development Team will be appointed by the National Training Services Manager to develop the Assessment Pack for a given unit of competency. The Assessment Development Team uses the Unit of Competency Details published on the training.gov.au website to create the assessment instruments for that unit of competency.

The appointed Assessment Development Team will follow TrainSmart Australia’s assessment development protocol and:

  • be subject matter experts for that unit of competency
  • have attended a briefing session on how to map assessment instruments to the mandatory requirements of a unit of competency, and
  • have the necessary training/assessing competencies as determined by the National Skills Standards Council.

In-house materials:

Assessment Instruments and Evidence Collection

Appendix One of this document outlines the steps taken and considerations made in planning the Assessment Process for development of each Assessment Pack.

A variety of assessment instruments (evidence-gathering techniques), appropriate to the context of the UoC and offering flexibility in learning styles, will be used for evidence collection.

Evidence in this case is defined as the information gathered to support a judgement of competence against the specifications of the relevant unit/s of competency.

Assessment instruments may include (but are not limited to):

  • knowledge based questions
  • written responses to scenarios
  • direct observation of work performance
  • evaluation of completed case studies and/or reports
  • documentary evidence of completed work
  • demonstration of skills
  • project work.

The Developer for each UoC will ensure that appropriate evidence is collected (quality and quantity) to the extent that all aspects of competency have been satisfied and that competency can be demonstrated repeatedly.

The final assessment pack for each UoC will cover a broad range of skills and knowledge that are essential for competent performance in the workplace. Knowledge and skill development are integrated with practical application and judgement of competence is based on a variety of evidence gathering techniques.

Where appropriate a number of units of competency may be grouped together for holistic assessment within one assessment pack.

TrainSmart also undertakes the contextualization and customization of assessment tasks to meet the needs of specific client groups:

  • unemployed clients will have assessment tasks that are case-study based rather than workplace based
  • for employed client groups, an industry representative and relevant TrainSmart training and assessing staff will work together to tailor training and assessment materials to better suit the workplace needs of the client group.

In-house materials:

Newly Developed Assessment Pack Evaluation and Sign-off

Prior to the use of a newly developed Assessment Pack, a review of the pack will be conducted.

For each newly developed Assessment Pack, the Validator will use an Assessment Pack Evaluation Checklist to ensure that the assessment instruments and evidence collection within the assessment pack:

  1. can be adequately mapped against the current Unit of Competency Details published on the National Register, and
  2. comply with the Principles of Assessment in terms of being valid, reliable, fair and flexible.

The Validator will then sign off the Assessment Pack Evaluation Checklist for that unit of competency, indicating whether the Assessment Pack is “ready for use” or “requires further modification” by an appointed Assessment Developer.

In-house materials:

Evidence to support planning and development of assessment packs


Evidence includes:

  • Assessment Pack Template
  • Assessment Validation Checklist (template)
  • Work placement activities and assessments template
  • Mapping document
  • Industry consultation documents
  • Industry Consultation Register
  • Master copy of Assessment Pack and accompanying Assessment Pack Evaluation Checklist (for each UoC)

Conducting Assessment

Information provided to clients

Each learner will complete a set of forms that make up their Learner Profile at the beginning of each program. The Learner Profile aims to identify any special needs, so that reasonable adjustments can then be made to training and assessing methods, where necessary, to enable learners with special needs to demonstrate competence.

The Trainer in charge of each unit of competency will ensure that the method, context and timing of the assessment process are made clear to all learners at the commencement of training for each UoC. This is done during the Induction process and is also detailed within the assessment pack for each UoC.

The Trainer will also explain to learners that, to accept evidence as authentic, an assessor must be assured that the evidence presented for assessment is the candidate’s own work. For this purpose, Turnitin is used to detect plagiarism within a learner’s assessment submission. Competency requires demonstration of current performance, so the evidence collected must be from either the present or the very recent past.

Every learner is notified of the opportunity to appeal the assessment result and/or be reassessed if necessary, as detailed in TrainSmart Australia’s Right to Appeal Policy.

Assessment Submission

Full details of assessment submission are described in TrainSmart Australia’s Procedure for Completed Assessments. This procedure includes the process for feedback from the assessor to the learner, which is constructive, timely and sufficient to inform the learner of any additional evidence/training they require to demonstrate competence.

Recognition of Prior Learning

Information on RPL is made available to learners through TrainSmart Australia’s Recognition of Prior Learning Policy.

Opportunity for RPL for a given unit of competency will be discussed between the learner and their Customer Support Officer at the pre-enrolment stage. If appropriate, learners will be issued an RPL kit for a given unit of competency.

Assessment Competencies

At TrainSmart Australia assessment is conducted by persons who:

  1. hold the appropriate competencies as determined by the National Quality Council, and
  2. have the relevant vocational competencies at least to the level being assessed; and
  3. can demonstrate current industry skills directly relevant to the assessment being undertaken; and
  4. continue to develop their VET knowledge and skills as well as their industry currency and assessor competence.

All assessors will have the above criteria detailed within their respective Trainer/Assessor Matrix.

Client feedback on assessment

Clients will be given the opportunity for feedback on their experience of the assessment process on a regular basis by completing the Program Evaluation Form. Full details of client feedback processes are described in TrainSmart Australia’s Feedback Policy and Procedure.

Evidence to support conducting of assessment

Evidence includes:

  • Master copy of Assessment Pack for each UoC
  • Learner Induction Checklist
  • Set of forms that make up a client’s Learner Profile
  • Trainer/Assessor Matrix
  • Recognition of Prior Learning Policy
  • RPL kits for each unit of competency
  • Procedure for Completed Assessments
  • Procedure for Collecting and Acting on Feedback

Validation & Moderation of Assessment

Evidence to support validation & moderation of assessment

Please refer to TrainSmart Australia’s Validation and Moderation Procedure.

Relevant training and assessing staff are informed of this procedure prior to developing materials, and are responsible for observing the guidelines stated in this procedure.

Appendix One:  Assessment Process

Steps in the planning process can be summarised as follows:

(Source: IBSA User Guide for TAE40110 Certificate IV in Training and Assessment)

  1. What evidence do I need to collect in ‘real’ or realistically simulated environments? What other evidence do I need? How do I ensure it meets the principles of validity?
    Consider:
      1. content – the match between the required knowledge and skills specified in the competency standards and the assessment tool’s capacity to collect such evidence
      2. face validity – the extent to which the assessment tasks reflect real, work-based activities
      3. construct – the degree to which the evidence collected can be used to infer competence in the intended area, without being influenced by other non-related factors (e.g. literacy levels).


  2. How will I get this evidence?
    Consider:
      1. The methods of collection that will form your assessment plan – will the chosen methods actually measure what they purport to measure, and are they practical to use? Does the evidence they collect cover the knowledge and skills that are essential to competent performance as set out in the unit of competency?

      2. The benchmark criteria – what marking guides and/or exemplars and assessment records will I need to develop:
          • to ensure fairness and flexibility with reasonable adjustments outlined?
          • to record expected responses, appropriate to the AQF level?
          • to define the decision-making rules for judging satisfactory responses? For example – what constitutes a satisfactory response? How will you deal with an incomplete response? Will you have a cut-off point; e.g. will 8/10 correct responses be sufficient? Will you elicit more information for two responses not given, or will you require a reassessment? Your instructions need to be clear to ensure intra/inter-rater reliability.

      3. The detailed mapping of assessment activities against unit requirements, including:
          • elements and performance criteria
          • required Skills and Knowledge
          • critical aspects for assessment and evidence required to demonstrate competency

      4. Recording and reporting documentation, including:
          • summary report and assessor signoff
          • feedback to and from learner
          • organisational requirements met.


  3. What processes and tools do I need to implement to achieve this assessment plan?
    Consider:
      1. Learning needs of specific cohort – any support strategies necessary?
      2. Context of learning program – workplace, distance, online, institution, enterprise? What does this mean in terms of:
          • the packaging of qualifications – clusters, electives?
          • facilities and resources – technology, support personnel, subject expertise?
          • the RPL process?
          • the flexibility of delivery?
          • information for students?
          • source and/or design resources – mode?


  4. Link planning for delivery and assessment in a single strategy document.


Policy Version: 3.4
Reviewed and Updated: May 2024
By: National Training and Compliance Manager