Assessment Framework


TVET CURRICULUM DEVELOPMENT, ASSESSMENT AND

CERTIFICATION COUNCIL

CBET TRAINING MANUAL


(ZERO DRAFT)

APRIL 2017

Page 1 of 104
1.0 REFORMS IN EDUCATION AND TRAINING

1.1 Introduction

The Ministry of Education is divided into three (3) State departments, each in charge of one of the three sub-sectors of education and training:

State Department for Basic Education: ECDE, Primary Education, Secondary Education and Teacher Education

State Department for Vocational and Technical Education: All National Polytechnics, Technical Training Institutes, Institutes of Science & Technology Colleges and Youth Polytechnics

State Department for University Education: All Public and Private Universities

The three Education and Training sub-sectors are guided by four (4) Acts of
Parliament as follows:

Basic Education Act, 2013

University Act, 2012

TVET Act No. 29 of 2013

Kenya National Qualifications Framework Act, 2014

These Acts address issues of access, equity, transition rates, relevance, quality and efficiency in the management of the respective sub-sectors.

1.2 Reforms in TVET

According to Sessional Paper No. 4 of 2016 (formerly Sessional Paper No. 14 of 2012):

Training in TVET shall be Competency Based

Curriculum development shall be industry led

Certification should be based on demonstration of competence

Allow for multiple entry and exit in TVET programmes

The TVET reforms are expected to result in a shift from:

Supply-led training to demand-driven training

Time-bound, knowledge-based training to flexible and competency-based training

Examination-led certification to competence certification

TVET as a last resort to rebranding and repositioning TVET as a sub-sector of choice for trainees

TVET graduates as job seekers to empowered TVET graduates as creators of employment

CBET: A mode of training which emphasizes the acquisition of competence. Competence is the ability to perform to the set standards.

1.3 Objectives of the CBET

Promote lifelong learning through progression and transfer

Encourage individuals to achieve their full potential

Develop attitudes and abilities to respond rapidly to change

1.4 Target groups

Those currently educated and trained who need to update their competences

Individuals in the informal sector who need certification (RPL)

Open to all who can receive instructions

1.5 Role of Various Stakeholders in the Reforms

a. Industry

Develop Occupational Standards through Sector Skills Advisory Committees (SSACs)

Assist in development and acquisition of training materials

Participate in external assessment of trainees and evaluation of the
CBET systems

Provide opportunities for industrial training and experience

Provide expert workers to participate in the curriculum development process

Provide external assessors and verifiers of assessment

b. Training providers

Identify training needs by conducting Training Needs Assessments

Provide training tools, equipment, supplies and materials

Provide adequate work stations for the CBET programmes

Recruit and register trainees (private institutions)

Conduct assessment for RPL

c. Provide trainers who:

Continuously assess trainee competence level

Provide career guidance and counselling

d. Provide Internal Verifiers who:

Continuously verify assessment of competence

e. Kenya National Qualifications Authority

Regulate all qualifications

Ensure all Kenyan qualifications are nationally and internationally recognised

Establish national qualifications at all levels

Provide pathways and linkages within qualifications

1.6 Conclusion

Implementation of the CBET curriculum will:

Improve employability of TVET graduates


Provide a critical mass of competent TVET graduates for socio-economic development

Contribute towards attainment of Vision 2030

2.0 ROLE OF CURRICULUM DEVELOPMENT, ASSESSMENT AND
CERTIFICATION COUNCIL (CDACC) AND STAKEHOLDERS IN THE CBET
SYSTEM

2.1 INTRODUCTION

a. Establishment

The TVET CDACC is established under section 44 (1) of the TVET Act, 2013 to
coordinate assessment, examination and certification of TVET.

b. Mandate

Its mandate is to undertake design and development of curricula for the training institutions, examination, assessment and competence certification.

c. Vision

To be a centre of excellence in TVET curriculum development, assessment and certification for a globally competitive labour force.

d. Mission

To provide demand-driven curricula, competence assessment and certification of TVET graduates for the global labour market.

2.2 Functions of TVET CDACC as per the TVET ACT NO. 29 of 2013

(a) Undertake design and development of curricula for the training institutions, examination, assessment and competence certification

(b) Make rules with respect to such examinations and competence assessments

(c) Issue certificates to candidates

(d) Promote recognition of its qualifications in foreign systems

(e) Investigate and determine cases involving indiscipline by candidates registered with it

(f) Promote and carry out research relating to its examinations

(g) Promote the publication of books and other materials relevant to its
examinations

(h) Anything incidental or conducive to the performance of any of the
preceding functions

2.3 Role of TVET CDACC

1. Establish a curriculum development and review process

2. Establish an assessment and certification process

3. Accredit curriculum developers, assessors, verifiers and assessment centres

4. Coordinate the validation and endorsement of Competency Based Curriculum

5. Keep a register of all accredited TVET programmes

6. Appoint Sector Skills Advisory Committees (SSACs)

7. Coordinate the development, endorsement and review of Occupational Standards

8. Grant permission to institutions to conduct Recognition of Prior Learning (RPL) assessments

9. Certify competence

2.4 Linkages between TVET CDACC and other key stakeholders

3.0 OVERVIEW OF THE CBET APPROACH

3.1 Introduction

The TVET system in Kenya has faced neglect for a long period of time, resulting in low financing, an inequitable participation rate, a mismatch between skills demanded and skills supplied, and inefficiency of TVET programmes.

These challenges affected the efficiency and effectiveness of the TVET system in meeting changing labour market and socio-economic demands, thus calling for a paradigm shift of the whole system.

3.2 Reforms in TVET

The Basic Education Act, 2013, the University Act, 2012 and the TVET Act, 2013 were all formulated to guide the three sub-sectors of education.

These Acts address issues of access, equity, transition rates, relevance, quality and efficiency in the management of the respective sub-sectors.

The reforms are expected to result in a shift from:

Supply-led training to demand-driven training;

Time-bound, curriculum-based training to flexible and competency-based training;

Examination-led certification to competence assessment certification;

TVET as a dumping place for failures to rebranding and repositioning TVET as a sector of choice for trainees;

TVET graduates as job seekers to empowered TVET graduates as creators of employment;

Limited funding of TVET to significant funding through the establishment of the TVET Funding Board;

Limited access and equity to increased access and enhanced equity through provision of loans and bursaries and a centralised admission and placement body.

3.3 The role of DTVET

Ensure that CBET policies and guidelines are disseminated to all key
stakeholders for implementation.

Update policies and procedures in the light of Kenyan experience and international developments.

Develop and implement national programmes to create awareness and ownership of CBET policies and guidelines.

3.4 The role of TVETA

Facilitate and regulate quality assurance

Registration of training providers

Quality Management System

Course accreditation

Monitoring and auditing

3.5 Role of TVET-CDACC

Spearhead nationwide campaign to promote the CBET system

Arrange and conduct training for CBET providers and assessors

Source, adapt or develop suitable CBET training curricula.

Develop and disseminate guidelines for preparation of CBET learning packages, in liaison with other stakeholders.

3.6 Institutional Structure

3.7 CBET

CBET is a mode of training where the emphasis is placed on the acquisition of competence in performing a task or skill.

Competence is the ability to perform tasks or do work according to set standards.

It is designed to meet the demands of industry and business.

It involves training individuals to be able to perform to the standards required in employment.

3.7.1 Objectives of the CBET

Establish occupational standards

Train competent individuals with transferrable skills

Link education and training to the skills needed by employers

Establish a quality assurance system which will have the confidence of all stakeholders

Promote lifelong learning through progression and transfer

Encourage individuals to achieve their full potential

Develop attitudes and abilities to respond rapidly to change

3.7.2 Target groups

Those currently educated and trained who need to update their competences

Individuals from the informal sector

Individuals from the formal sector

The unemployed population

Out of school youth

Those with little or no education

Individuals who have been made redundant, so that they can learn new competences

Those who have retired but who would like to develop new skills and
competences

3.7.3 Objective Based Curriculum vs. CBET

In the normal teaching-learning process, an Objective Based Curriculum is followed.

Here, the specific objectives are clearly stated in each of the three
learning domains and the achievement is planned and developed in
these domains independently.

In the case of CBET, the competency statements resemble the objective statements in appearance, but the competency statements cut across the domains.

These competency statements are comprehensive in nature.

They describe what the learner will be able to do comprehensively in the work situation, including:

the skill component

underpinning knowledge

the related attitudinal element

3.7.4 The Proposed Competency Based Education and Training System

3.8 The Role of Industry

Set Occupational Standards through appropriate Sector Skills Advisory Committees (SSACs)

Monitoring standards/providing verifiers

Assisting in the development and acquisition of training materials

Participating in the assessment of trainees and evaluation of the CBET systems

Providing opportunities for industrial training and experience in industry

3.9 Role of Training Providers

Establishing training needs by carrying out Training Needs Assessments

Provide competent staff familiar with the CBET system

Provide the necessary tools, equipment and materials

Assist in the development and acquisition of training materials.

Develop adequate workstations for the modular programme

Recruit and register trainees

3.10 The Role of Trainers

Assess trainee competence level

Provide for accreditation of prior learning

Guide, support and counsel the trainee in identifying the appropriate programme route.

Ensure that the trainee is issued with a number that must be maintained throughout the training period.

3.11 The KNQF

Will link together all qualifications

Will ensure that all Kenyan qualifications are nationally and internationally recognised.

Within the NQF, there shall be TVET qualifications at the levels of Certificate, Diploma, Higher Diploma and Degree, as well as short courses/modules, which can all be given a level and the learning quantified in terms of notional learning hours.

TVET CDACC, in collaboration with TVETA, will work with KNQFA to develop level descriptors which will enable qualifications to be considered for placement.

3.12 TVET Qualifications

TVET Qualifications should:

Meet the requirements of industry

Be based on Industry Occupational Standards

Be based on Units containing elements of Competence

Be developed into Learning Modules written in learning outcomes which make up a Qualification

Provide flexibility of delivery

Give credit for prior achievement

Enable credit accumulation of completed Modules

Be assessed against National Competence standards

Have Quality Assurance measures to validate standards and achievement

Develop a strong partnership between Industry and Education

3.13 Upgrading pathways and progression

3.14 Sector Skills Advisory Committees

Will be the technical arm of the TVET CDACC

Responsible for ensuring that vocational education and training programmes offered are according to the needs and demands of the employment market.

Advise the TVETA Board and TVET CDACC on the establishment of new
vocational trades training and educational programmes or on the
abolition of existing ones.

Assess training needs and determine standards for the area of trades
the committee is covering.

Will draw up training specifications and job description for the trades to
be taught.

Will ensure coordination with related trade training activities.

Will ensure that the employment market is informed about training activities under the committees to promote placements for students; and

Will form sub-committees to work on particular training issues as may be necessary.

3.15 Occupational Standards Development

Based on the concept of competence, which is defined as the ability to perform the activities within an occupation.

Competence encompasses organisation and planning of work, innovation and coping with non-routine activities.

A Unit of Competency will be made up of a number of Elements of Competency written as Performance Criteria.

Occupational standards are not time bound.

They do not attribute the amount of learning time required to achieve them, as it will vary considerably depending on the background of individuals and their experience.

Figure 5 outlines qualification structure.

3.16 Qualification Structure

3.17 Recognition of Prior Learning

The new CBET system will enable individuals to be assessed based on their prior learning or achievement.

However, this can be time-consuming and requires the system to be fully developed before it can be achieved.

TVET CDACC is to develop the process of assessing prior competence.

3.18 Conclusion

Implementation of the CBET curriculum will:

Improve employability of TVET graduates

Contribute to provision of a critical mass of skilled human resources

Contribute to attainment of Vision 2030

4.0 TRAINING NEEDS ASSESSMENT/ ANALYSIS (TNA)

The curriculum development process begins with a training needs assessment/analysis (TNA). This is an assessment process that serves as a diagnostic tool for determining what training should take place. The curriculum initiator shall carry out a TNA.

a) The TNA shall, among others:

i. Establish how the proposed curriculum addresses industrial, economic, social and technological needs of the Country.

ii. Determine competencies (knowledge, skills, and attitudes) required in the identified skill gap area.

iii. Identify the availability and competencies of trainers for the proposed curriculum.

iv. Assess the availability of resources to implement the proposed curriculum.

v. Determine the training requirements of trainees with special needs.

vi. Establish the findings and feedback derived from implementation of the existing curricula, in the case of a review.

vii. Project current and future employment patterns for skilled workers.

A successful training needs assessment should identify those who need training and what kind of training is needed.

TNA enables one to:

o Identify performance goals and the competencies needed by the workforce to achieve those goals

o Identify gaps in existing training programmes

o Direct training resources to areas of greatest priority.

4.1 Role of TNA in Curriculum Development

TNA plays a crucial role in curriculum development. It shall help in:

Identifying the gap between current and required levels of knowledge, skills and attitude (competency) in the workplace (function).

Identifying, in general terms, what the content of training should be (broad picture); the specific content will be identified by the Occupational Standards.

Forming the foundation of a training plan.

Providing a baseline for the evaluation of a training plan.

Ensuring that appropriate and relevant training is delivered.

4.2 TNA proposal

Preceding the TNA survey is the TNA proposal.

The TNA proposal:

Shall be written in the future tense because it is an intent to carry out the survey.

Is a contract document between the person carrying out the TNA and the organization funding the survey.

Must convince the financier of the survey of the need to commit funds.

4.2.1 Components of the Proposal

a. Background:

Must bring out the mismatch between the competencies needed by the
industry and the skills developed through training using Knowledge
Based Learning.

Indicate that changes in technical training policy require that all TVET
training programs be competency based.

b. Statement of the Problem:

Must indicate how the current TVET training programmes have failed to
meet the competence demands of the industry.

Shall be a statement of the problem that exists and the outcomes of training using the current training programs.

It clearly outlines the approach that will be used to tackle the stated
problem.
It attempts to answer the questions: what, where, when, why and how.

c. Objectives

The TNA proposal shall have both the general and specific objectives of the survey.

It shall also have the survey questions which are drawn from the
specific objectives.

d. Justification

The initiator must justify why there is a need to carry out the survey and state the consequences of not carrying it out.

e. Methodology

It shall provide a detailed discussion of the tools and methods to be used to collect data.

f. Appendices

These are documents attached to the TNA proposal. They include:

The TNA tools - prepared based on the method(s) to be used for data collection

The budget - a breakdown of the financial requirements for carrying out the survey

The work plan - shows the duration and timing of activities

etc.

4.3 TNA Report

A TNA Report is evidence that there exists a gap that needs to be addressed
through training. The TNA report shall among other things:

Establish how the proposed curriculum addresses Industrial, Economic, Social and Technological needs of the Country.

Determine competencies (knowledge, skills, attitudes) required in the identified competency gap area.

Address emerging trends and challenges in the Job market/Economy


Identify the availability and competencies of trainers for the proposed
curriculum

Assess the availability of resources to implement the proposed curriculum

Determine the training requirements of trainees with special needs

Project current and future employment patterns for competent workers

4.3.1 Components of a TNA Report

A TNA report shall be written in past tense and shall have the following:

Abstract-a brief summary of the TNA report

Background

Problem statement

Objectives

Survey questions

Methodology used

Data analysis and interpretation

Findings

Conclusions

Recommendations

4.3.2 Appendices

The following documents shall be attached to the TNA report:

The TNA tools-used in data collection

The organizations involved - a detailed list of organizations and persons that responded to the survey

The work plan- shows the duration of the survey

References

4.3.3 Conclusion

The TNA report shall be submitted by the curriculum initiator alongside the application forms for development of a CBET curriculum.

4.4 TNA TOOLS

The curriculum initiator shall first identify the respondents and the methodology for the TNA survey, design relevant data collection tools or instruments, and then carry out the Training Needs Assessment.

TNA tools shall include, but are not limited to:

Questionnaires
Interview guides
Observation schedules
Document analysis guides

4.4.1 Description

These are data collection instruments that consist of:

Close-ended and open-ended items in the form of questions or statements.

Close-ended items are structured in such a way that the respondents are provided with a list of responses from which to select an appropriate answer, e.g. gender:

Male ( ) Female ( )

Open-ended items are constructed in such a way that the respondents are given freedom to write what they feel is the appropriate answer, e.g. What is your age? ------
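The distinction between close-ended and open-ended items can be sketched as simple data structures. The following is an illustrative Python sketch only; the item texts and helper names are hypothetical examples, not part of any CDACC tool:

```python
# Illustrative sketch: representing close-ended and open-ended TNA items.
# Item texts and function names are hypothetical, not CDACC specifications.

def make_close_ended(text, options):
    """A close-ended item offers a fixed list of responses to pick from."""
    return {"type": "close", "text": text, "options": options}

def make_open_ended(text):
    """An open-ended item lets the respondent answer freely."""
    return {"type": "open", "text": text}

def is_valid_response(item, response):
    """A close-ended response must be one of the listed options;
    any non-empty text is accepted for an open-ended item."""
    if item["type"] == "close":
        return response in item["options"]
    return bool(response.strip())

gender = make_close_ended("Gender:", ["Male", "Female"])
age = make_open_ended("What is your age?")

print(is_valid_response(gender, "Male"))   # a listed option -> True
print(is_valid_response(gender, "Other"))  # not listed -> False
print(is_valid_response(age, "27"))        # free text -> True
```

This mirrors the idea above: close-ended items constrain the respondent to a predefined list, while open-ended items only require that something be written.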

4.4.2 Types of TNA tools and their administration

a. Questionnaire

Description

A questionnaire is a carefully designed instrument (written, typed or printed) for collecting primary data from respondents.

A questionnaire can be:

Self-administered - the respondent fills in the questionnaire in the absence of the data collector; or

Interviewer-administered - the data collector fills in the questionnaire as the respondent responds.
A self-administered questionnaire may be

Directly delivered to the respondents - the researcher distributes the instruments personally to the respondents.

Mailed through the post office - the questionnaire is sent to participants through the post office. The participant fills it in and sends it back to the researcher.

E-mailed - the researcher sends the questionnaire using electronic mail. The respondent fills it in and sends it back through the same e-mail address.

b. The interview guide

Description

This is a tool for primary data collection used by the interviewer during the interview as a guide.

It is used during interviews to direct verbal communication between the researcher and the respondent.

Interview Guides may be structured, unstructured or in-depth.

i) Structured Interview Guide

Contains both close-ended and open-ended items.

Is rigid in that the researcher reads the items as they are without
changing or expounding on the items.

A questionnaire can be used as a structured interview guide, whereby the researcher fills in the information while the respondent gives the responses.

ii) Unstructured Interview Guide

Contains open-ended items.

Is flexible in that the researcher is free to re-word as well as expound on the items.

iii) In-depth Interview Guide

Is a research tool used to collect detailed information by probing further and further using the responses given by the respondent.

Is flexible because the researcher can make clarifications, expound on the items, or change the items should there be need.

Interview Guides may be administered via:

Face-to-face: the interviewer sits with the interviewee and asks questions concerning the issues being studied, following the items in the interview guide.

Telephone interview: the interviewer uses the telephone to interview the respondents without going physically to the field.

c. Observation guides

Description

An observation guide is a research instrument that guides the researcher in gathering data from key areas through sight.

This involves looking at the phenomena, objects or behaviour indicated in the instrument, from which meanings are extracted or analyzed.

There are two types of Observation

Participant observation: the researcher is involved in the day-to-day life of people while collecting information, without them knowing that he/she is collecting data.

Non-participant observation: the researcher collects data without participating in the activities of the people. The people are aware that he/she is collecting data about them.

d. Document analysis guide

Description

A document analysis guide:

Assists the researcher to collect secondary data during document review, e.g. review of Labour Market Information documents.

Is a vital tool in historical research, where past documents are analyzed to unveil historical facts, understand the circumstances that led to the development of certain theories, etc.

4.5 Guidelines for developing TNA tools

In developing items in the TNA tool, the researcher shall:

o Avoid double questions, e.g. Are you an effective and efficient worker?

o Not use leading or biased questions, e.g. Don't you agree that A-level education should be abolished?

o Not use presuming questions, e.g. How do you like teaching in a remote part of the country?

o Avoid or rephrase sensitive or threatening questions, e.g. Have you ever handled stolen property?

4.6 Types of items in a TNA tool

A TNA tool shall consist of the following items:

1. Demographic Items

These are questions or statements which seek background information about the respondent, e.g.

Gender: Male ( ) Female ( )

Age: 20 to 25 ( ) 26 to 30 ( )

Education level: ____________________

Marital status: _____________________

Occupation: _____________________

Religious affiliation: ___________________

2. Information Items

These questions seek to find out the respondent's knowledge of an area of concern, e.g.

Which year was the Kenya Institute of Education established? ________________________

What is the number of students enrolled in the ICT department? __________________

How many students attended Industrial Attachment during the 2014/2015 academic year? ________

3. Self-Perception Items

These are items that try to determine the respondent's evaluation of his/her behavior in relation to others, and also his/her evaluation of others, e.g.

How do you rate your ability to handle stress?

_________________________________________

Are all teachers in your institute conversant with competency assessment methods?

Yes ( ) No ( ) I do not know ( )

4. Attitudinal Items

These types of items solicit information about respondents' attitudes, beliefs, feelings and perceptions related to an area of research.

Also known as Likert Scale items, in which the respondent selects the most appropriate response from a weighted scale, e.g.:

Please rate the extent to which you Strongly Agree (SA), Agree (A), Disagree (D), or Strongly Disagree (SD) with the statements.
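Responses to such Likert-scale items are typically tallied by assigning each point on the scale a weight and averaging per statement. The sketch below is a hypothetical illustration (the weights SA=4 down to SD=1 and the sample responses are assumptions, not a CDACC-prescribed scheme):

```python
# Hypothetical Likert tally: weight SA=4, A=3, D=2, SD=1 and average
# the responses received for one statement. Weights and data are illustrative.

WEIGHTS = {"SA": 4, "A": 3, "D": 2, "SD": 1}

def mean_score(responses):
    """Average weighted score for one statement's responses."""
    return sum(WEIGHTS[r] for r in responses) / len(responses)

# Example: responses from five (hypothetical) respondents to one statement.
responses = ["SA", "A", "A", "D", "SA"]
print(round(mean_score(responses), 2))  # (4+3+3+2+4)/5 = 3.2
```

A mean near the top of the scale suggests broad agreement with the statement; during data analysis such per-statement scores feed into the findings section of the TNA report.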

4.7 Questionnaire format

The questionnaire may take the following format:

Contact Address of the researcher

Category of respondents (right tool to the right respondents)

Title of the survey, e.g. National Training Needs Assessment for the development of a CBET curriculum for ...

Purpose

The purpose of this study is to determine the existence of competency gaps that can be addressed through training.

Instructions

Place a tick in the bracket ( ) in front of the most appropriate response.

Where explanation is required use the space provided.

4.7.1 Organization

The questionnaire should be organized in sections:

Section A: Background information

Section B: Information questions

Section C: Attitude questions

Section D: General questions
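The four-section layout above can be sketched as an ordered structure, so items are grouped and printed in the prescribed A-to-D order. This is an illustrative sketch; the section item texts are hypothetical placeholders:

```python
# Sketch: a questionnaire as an ordered list of (section title, items) pairs,
# rendered in the prescribed A-to-D order. Item texts are placeholders only.

sections = [
    ("Section A: Background information", ["Gender: Male ( ) Female ( )"]),
    ("Section B: Information questions", ["How many students are enrolled? ____"]),
    ("Section C: Attitude questions", ["Rate: SA ( ) A ( ) D ( ) SD ( )"]),
    ("Section D: General questions", ["Any other comments? ____"]),
]

def render(sections):
    """Return the questionnaire body as numbered lines, section by section."""
    lines = []
    for title, items in sections:
        lines.append(title)
        for i, item in enumerate(items, start=1):
            lines.append(f"  {i}. {item}")
    return "\n".join(lines)

print(render(sections))
```

Keeping the sections in one ordered structure guarantees that background items always precede information and attitude items, matching the organization described above.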

5.0 JOB ANALYSIS

5.1 What is Job Analysis?

Job Analysis is the systematic exploration, study and recording of the responsibilities, duties, skills, accountabilities, work environment and ability requirements of a specific job. It also involves determining the relative importance of the duties, responsibilities and the physical and emotional skills for a given job. All these factors identify what a job demands and what an employee must possess to perform the job productively.

5.2 What does job analysis involve?

The process of job analysis involves in-depth investigation in order to control the output, i.e., get the job performed successfully. The process helps in finding out what a particular department requires and what a prospective worker needs to deliver. It also helps in determining particulars about a job including job title, job location, job summary, duties involved, working conditions, possible hazards and the machines, tools, equipment and materials to be used by the existing or potential employee.

However, the process is not limited to determination of these factors only. It also extends to finding out the necessary human qualifications to perform the job. These include establishing the levels of education, experience, judgment, training, initiative, leadership skills, physical skills, communication skills, responsibility, accountability, emotional characteristics and unusual sensory demands. These factors change according to the type, seniority level, industry and risk involved in a particular job.

5.3 Importance of job analysis

The details collected by conducting a job analysis play an important role in controlling the output of the particular job. The success of a job depends on an unbiased, proper and thorough job analysis. Job analysis also helps in recruiting the right people for a particular job. The main purpose of conducting this whole process is to create and establish a perfect fit between the job and the employee.

Job analysis also helps HR managers in deciding the compensation package and additional perks and incentives for a particular job position. It effectively contributes to assessing the training needs and performance of existing employees. The process forms the basis for designing and establishing the strategies and policies to fulfill organizational goals and objectives.

However, analysis of a particular job does not guarantee that the managers or the organization will get the desired output. Collecting and recording information for a specific job involves several complications. If the job information is not accurate and checked from time to time, an employee will not be able to perform his duty well. Unless he is aware of what he is supposed to do and what is expected of him, chances are that the time and energy spent on the job analysis will be a sheer waste of human resources. Therefore, proper care should be taken while conducting a job analysis.

A thorough and unbiased investigation or study of a specific job is good for both the managers and the employees. The managers get to know whom to hire and why. They can fill a place with the right person. On the other hand, the existing or potential employee gets to know what and how he is supposed to perform the job and what the desired output is. Job analysis creates the right fit between the job and the employee.

5.4 Purpose of job analysis

Job Analysis plays an important role in recruitment and selection, job evaluation, job designing, deciding compensation and benefits packages, performance appraisal, analyzing training and development needs, assessing the worth of a job and increasing personnel as well as organizational productivity.

Recruitment and Selection: Job Analysis helps in determining what kind of person is required to perform a particular job. It points out the educational qualifications, level of experience and technical, physical, emotional and personal skills required to carry out a job in the desired fashion. The objective is to fit the right person in the right place.

Performance Analysis: Job analysis is done to check whether the goals and objectives of a particular job are met or not. It helps in deciding the performance standards, evaluation criteria and the individual's output. On this basis, the overall performance of an employee is measured and he or she is appraised accordingly.

Training and Development: Job Analysis can be used to assess the training and development needs of employees. The difference between the expected and actual output determines the level of training that needs to be imparted to employees. It also helps in deciding the training content, the tools and equipment to be used to conduct training, and the methods of training.

Compensation Management: Job analysis plays a vital role in deciding
the pay packages, extra perks and benefits, and fixed and variable
incentives of employees. After all, the pay package depends on the
position, job title, and the duties and responsibilities involved in a job.
The process guides HR managers in deciding the worth of an employee
for a particular job opening.

Job Designing and Redesigning: The main purpose of job analysis is
to streamline human effort and get the best possible output. It
helps in designing, redesigning, enriching, evaluating and also cutting
back and adding extra responsibilities in a particular job. This is
done to enhance employee satisfaction while increasing human
output.


Therefore, job analysis is one of the most important functions of an HR
manager or department. It helps in fitting the right kind of talent in the
right place at the right time.

5.5 Job analysis methods

The most common methods of job analysis are described below.

Observation Method: A job analyst observes an employee and
records all his performed and non-performed tasks, fulfilled and
unfulfilled responsibilities and duties, the methods, ways and skills used by
him or her to perform various duties, and his or her mental or emotional
ability to handle challenges and risks. Although it seems one of the
easiest methods of analyzing a specific job, it is in truth one of the most
difficult. Why? Let us discover.

It is due to the fact that every person has his own way of observing
things. Different people think differently and interpret findings in
different ways. Therefore, the process may involve personal bias, or likes
and dislikes, and may not produce genuine results. This error can be
avoided by proper training of the job analyst, or of whoever will be
conducting the job analysis process.

This particular method includes three techniques: direct observation,
work methods analysis and the critical incident technique. The first
includes direct observation and recording of the behaviour of an
employee in different situations. The second involves the study of time
and motion and is especially used for assembly-line or factory workers.
The third is about identifying the work behaviours that result in
performance.
Interview Method: In this method, an employee is interviewed so
that he or she can describe his or her own working style, the problems
faced, the particular skills and techniques used while performing the
job, and any insecurities and fears about his or her career.

This method helps the interviewer know what exactly an employee thinks
about his or her own job and the responsibilities involved in it. It involves
analysis of the job by the employee himself. In order to generate honest
and true feedback and collect genuine data, the questions asked during the
interview should be carefully decided. To avoid errors, it is always
good to interview more than one individual to get a pool of responses,
which can then be generalized and used for the whole group.

Questionnaire Method: Another commonly used job analysis method
is getting questionnaires filled in by employees, their superiors and
managers. However, this method also suffers from personal bias.
Great care should be taken while framing questions for different
grades of employees.

In order to get true job-related information, management should effectively
communicate to the staff that the data collected will be used for their
own good. It is very important to assure them that it won't be used
against them in any way. If this is not done properly, the exercise will be a
sheer waste of time, money and human resources.

These are some of the most common methods of job analysis. However,
there are several other specialized methods, including the task inventory,
the job element method, competency profiling, the technical conference, the
threshold traits analysis system and combinations of these methods. While
choosing a method, HR managers need to consider the time, cost and
human effort involved in conducting the process.

5.6 What to Collect during Job Analysis

Gathering job-related information involves a lot of effort and time. The
process may become cumbersome if its main objective is not known.
Any information can be gathered and recorded, but it may be hazardous to
the health and finances of an organization if it is not known what is required
and why.

Before starting a job analysis process, it is necessary to decide what type of
content or information is to be collected and why. The purpose of the
process may range from uncovering hidden dangers to the organization, to
creating the right job-person fit, establishing effective hiring practices,
analyzing training needs, evaluating a job, analyzing the performance of an
employee, setting organizational standards and so on. Each of these
objectives requires a different type of information or content.

While gathering job-related content, the job analyst or the dedicated person
should know the purpose of the action and try to collect data as accurately
as possible. Though the data collected is later divided into two sets (job
description and job specification), during the process of analyzing a specific
job the information falls into three categories: job content, job context and
job requirements.

a. Job Content
b. Job Context
c. Job Requirements

a. Job Content: It contains information about the various job activities included
in a specific job. It is a detailed account of the actions which an employee
needs to perform during his tenure. The following information needs to be
collected by a job analyst:

Duties of an employee

What an employee actually does

Machines, tools and equipment to be used while performing a specific job

Additional tasks involved in a job

Desired output level (what is expected of an employee)

Type of training required

Fig 1.1 Categorization of Job Analysis Information

The content depends upon the type of job in a particular division or
department. For example, the job content of a factory-line worker would be
entirely different from that of a marketing executive or HR personnel.

b. Job Context: Job context refers to the situation or conditions under which
an employee performs a particular job. The information collected will
include:

Working Conditions

Risks involved
Whom to report to

Who will report to him or her

Hazards

Physical and mental demands

Judgment

As with job content, the data collected under this category is also
subject to change according to the type of job in a specific division or
department.

c. Job Requirements: These include the basic but specific requirements which
make a candidate eligible for a particular job. The collected data includes:

Knowledge or basic information required to perform a job successfully

Specific skills such as communication skills, IT skills, operational
skills, motor skills, processing skills and so on

Personal abilities including aptitude, reasoning, manipulative
abilities, handling of sudden and unexpected situations, problem-solving
ability, mathematical abilities and so on

Educational qualifications including degree, diploma, certification
or licence

Personal characteristics such as the ability to adapt to different
environments, endurance, willingness, work ethic, eagerness to
learn and understand things, behaviour towards colleagues,
subordinates and seniors, a sense of belonging to the
organization, etc.

For different jobs, the parameters would be different. They depend upon the
type of job, designation, compensation grade and responsibilities and risks
involved in a job.
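The three information categories above can be sketched as a simple record structure. This is a minimal illustration only; the class and field names are assumptions for the sketch, not part of any official job analysis template:

```python
from dataclasses import dataclass, field

@dataclass
class JobContent:
    duties: list[str] = field(default_factory=list)            # what the employee actually does
    tools_and_equipment: list[str] = field(default_factory=list)
    additional_tasks: list[str] = field(default_factory=list)
    desired_output: str = ""                                   # what is expected of the employee
    training_required: str = ""

@dataclass
class JobContext:
    working_conditions: str = ""
    risks_and_hazards: list[str] = field(default_factory=list)
    reports_to: str = ""                                       # whom to report to
    direct_reports: list[str] = field(default_factory=list)    # who reports to him or her
    physical_and_mental_demands: str = ""

@dataclass
class JobRequirements:
    knowledge: list[str] = field(default_factory=list)
    skills: list[str] = field(default_factory=list)            # communication, IT, motor skills, ...
    abilities: list[str] = field(default_factory=list)         # aptitude, reasoning, problem solving, ...
    qualifications: list[str] = field(default_factory=list)    # degree, diploma, certification, licence
    personal_characteristics: list[str] = field(default_factory=list)

@dataclass
class JobAnalysisRecord:
    job_title: str
    content: JobContent = field(default_factory=JobContent)
    context: JobContext = field(default_factory=JobContext)
    requirements: JobRequirements = field(default_factory=JobRequirements)
```

For example, a record for a hypothetical "Welder" job would be created as `JobAnalysisRecord("Welder")` and its duties appended to `record.content.duties`. The point of the sketch is simply that each piece of collected data belongs to exactly one of the three categories.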

Let us now discuss a few of the job analysis tools that organizations
commonly use to investigate the demands of a specific job.

5.7 Job Analysis Tools

Job Analysis supports all other management activities, including recruitment
and selection, training and development needs analysis, performance analysis
and appraisal, job evaluation, job rotation, enrichment and enlargement,
creation of the right job-individual fit, and regulation of the entry and exit of
talent in an organization. The process is the basis of all these important
management activities and therefore requires solid ground preparation. A
properly performed job analysis lays a strong organizational foundation.

There are various tools and techniques, such as the O*Net model, PAQ model,
FJA model, F-JAS model and competency model, that help HR managers
develop genuine job description and job specification data. Though not very
new, these specialized tools and techniques are used by only a few
high-profile organizations. While not in common use, once understood these
systematic approaches prove extremely useful for measuring the worth of any
job in an organization.

O*Net Model: The beauty of this model is that it helps managers or
job analysts in listing job-related data for a very large number of jobs
simultaneously. It helps in collecting and recording basic and initial
data, including educational requirements, physical requirements and, to
some extent, mental and emotional requirements. It also links the
level of compensation and benefits, perks and advantages to be
offered to a prospective candidate for a specific job.

FJA Model: FJA stands for Functional Job Analysis and helps in
collecting and recording job-related data to a deeper extent. It is used
to develop task-related statements. Developed by Sidney Fine and his
colleagues, the technique helps in determining the complexity of the
duties and responsibilities involved in a specific job. This work-oriented
technique works on the basis of the relatedness of job data, where the
complexity of work is determined on a scale of scores given to a
particular job; the lower scores represent greater difficulty.

PAQ Model: PAQ stands for Position Analysis Questionnaire. This
well-known and commonly used technique analyzes a job by having
questionnaires filled in by job incumbents and their superiors.
Designed by a trained and experienced job analyst, the process
involves interviewing subject matter experts and employees and
evaluating the questionnaires on that basis.

F-JAS Model: Representing the Fleishman Job Analysis System, this is a
basic and generic approach to discovering common elements in different
jobs, including verbal abilities, reasoning abilities, idea generation,
quantitative abilities, attentiveness, spatial abilities, visual and other
sensory abilities, manipulative abilities, reaction time, speed analysis,
flexibility, emotional characteristics, physical strength, perceptual
abilities, communication skills, memory, endurance, balance, and
coordination and movement control abilities.

Competency Model: This model describes the competencies of
employees in terms of knowledge, skills, abilities, behaviours, expertise
and performance. It also helps in understanding what a prospective
candidate requires at the time of entry into an organization at a
particular designation, in a given work environment and schedule. The
model also includes basic elements such as qualifications,
experience, education, training, certifications, licences, legal
requirements and the willingness of a candidate.

Job Scan: This technique defines the personality dynamics and
suggests an ideal job model. However, it does not discuss the
individual competencies, such as the intellect, experience or physical and
emotional characteristics of an individual, required to perform a specific
job.

Different tools can be used in different situations. Selection of an ideal job
analysis tool depends upon the job analysis needs and objectives and the
amount of time and resources available.

5.8 Problems with Job Analysis

The process involves a variety of methods, tools and plans, and a lot of human
effort; and where people are involved, nothing can be 100 percent
accurate. The results may nevertheless be appropriate, considering various
factors including organizational requirements, time, effort and financial
resources. Since all job analysis processes, methods and tools are designed
by humans, they tend to have practical issues associated with them;
everything created, designed or developed by humans has some constraint
or other.

The process of job analysis has a lot of practical problems associated with it.
Though the process can be effective, appropriate, practical, efficient and
focused, it can also be costly, time consuming and disruptive for employees,
because there are some typical problems that a job analyst encounters while
carrying out the process. Let us discuss them and understand how the
process of job analysis can be made more effective by treating them
carefully.

Lack of Management Support: The biggest problem arises when a
job analyst does not get proper support from the management. The top
management needs to communicate the purpose of the exercise to the
middle-level managers and employees to enhance the output or
productivity of the process. In case of improper communication,
employees may take it in the wrong sense and start looking for other
available options. They may have a notion that it is being carried out to
fire them or to take some action against them. In order to avoid such
circumstances, top management must effectively communicate the right
message to their incumbents.
Lack of Co-operation from Employees: When it comes to collecting
authentic and accurate job data, it is almost impossible to get real and
genuine data without the support of employees. If they are not ready
to co-operate, conducting the job analysis process is a sheer waste of
time, money and human effort. The need is to take the workers into
confidence and communicate that it is being done to solve their own
problems.

Inability to Identify the Need for Job Analysis: If the objectives and
needs of the job analysis process are not properly identified, the whole
exercise of investigation and research is futile. Managers must decide in
advance why the process is being carried out, what its objectives are,
and what is to be done with the collected and recorded data.

Bias of the Job Analyst: A balanced and unbiased approach is a
necessity while carrying out the process of job analysis. To get real and
genuine data, a job analyst must be impartial in his or her approach. If
this cannot be ensured, it is better to outsource the process or hire a
professional job analyst.

Using a Single Data Source: A job analyst needs to consider more
than one source of data in order to collect true information. Collecting
data from a single source may result in inaccuracy, which defeats the
whole purpose of conducting the job analysis process.

However, this is not an exhaustive list. There may be many other problems
involved in a job analysis process, such as insufficient time and resources,
distortion from the incumbent, lack of proper communication, improper
questionnaires and other forms, absence of verification and review of the job
analysis process, and lack of reward or recognition for providing genuine and
quality information.

5.9 Advantages and Disadvantages of Job Analysis

Though job analysis plays a vital role in all other human resource activities,
every process that involves human intervention also suffers from some
limitations, and the process of job analysis is no exception. So, let us discuss
the advantages and disadvantages of the job analysis process at length.
Advantages of Job Analysis

Provides First-Hand Job-Related Information: The job analysis
process provides valuable job-related data that helps managers
and job analysts understand the duties and responsibilities of a
particular job, the risks and hazards involved in it, the skills and abilities
required to perform the job, and other related information.
Helps in Creating the Right Job-Employee Fit: This is one of the most
crucial management activities. Filling the right person in the right job
vacancy is a test of the skills, understanding and competencies of HR
managers. Job Analysis helps them understand what type of employee
will be suitable to deliver a specific job successfully.

Helps in Establishing Effective Hiring Practices: Which position is to be
filled, and when? Whom to target, and how, for a specific job
opening? The job analysis process gives answers to all these questions and
helps managers in creating, establishing and maintaining effective
hiring practices.

Guides the Performance Evaluation and Appraisal
Processes: Job Analysis helps managers evaluate the performance
of employees by comparing the standard or desired output with the
delivered or actual output, and appraising their performance on that
basis. The process helps in deciding whom to promote and
when. It also guides managers in understanding skill gaps, so that the
right person can be fitted in that particular place in order to get the
desired output.

Helps in Analyzing Training & Development Needs: The process
of job analysis gives answers to the following questions:

To whom training should be imparted

When training should be imparted

What the content of the training should be

What type of training is needed: behavioural or technical

Who will conduct the training

Helps in Deciding the Compensation Package for a Specific Job: A
genuine and unbiased process of job analysis helps managers in
determining the appropriate compensation package, benefits and
allowances for a particular job. This is done on the basis of the
responsibilities and hazards involved in the job.

Disadvantages of Job Analysis

Time Consuming: The biggest disadvantage of the job analysis process is
that it is very time consuming. This is a major limitation, especially when
jobs change frequently.
Involves Personal Bias: If the observer or job analyst is an
employee of the same organization, the process may involve his or her
personal likes and dislikes. This is a major hindrance to collecting
genuine and accurate data.

Source of Data is Extremely Small: Because of the small sample size,
the source of the collected data is extremely limited. Information
collected from a few individuals therefore needs to be standardized.

Involves a Lot of Human Effort: The process involves a great deal of
human effort. As every job carries different information and there is no
set pattern, customized information has to be collected for different
jobs, and the process needs to be conducted separately for collecting
and recording the job-related data of each.

Job Analyst May Not Possess Appropriate Skills: If the job analyst is
not aware of the objective of the job analysis process or does not
possess the appropriate skills to conduct it, the exercise is a sheer
waste of the company's resources. He or she needs to be trained in
order to obtain authentic data.

Mental Abilities Cannot be Directly Observed: Last but not least,
mental abilities such as intellect, emotional characteristics,
knowledge, aptitude, psyche and endurance are intangible and
cannot be observed or measured directly. People act differently in
different situations. Therefore, general standards cannot be set for
mental abilities.

6.0 OCCUPATIONAL STANDARDS

6.1 What is a National Occupational Standard?

National Occupational Standards (NOS) are statements of
the standards of performance individuals must achieve when carrying out
functions in the workplace, together with specifications of the underpinning
knowledge and understanding.

National Occupational Standards (NOS) describe best practice by bringing
together skills, knowledge and values. They are valuable tools, serving as
benchmarks for qualifications as well as for defining roles at work, staff
recruitment, supervision and appraisal.

6.2 Why develop Standards?

Standards are developed in response to areas identified by labour
market surveys, Government plans and projections, sector organizations
and training providers. Where Regional Occupational Standards (ROS) exist,
these are vetted by the relevant Industry Lead Bodies and can be submitted
for approval or further vetting. Where there are no ROS, international
standards are sought and validated before being approved. Where neither
relevant ROS nor international standards exist, an occupational analysis is
conducted which analyzes and documents the requirements of the
occupation and the work performed. This information is used as the basis
for the development of draft competency standards for the occupation. The
draft standards developed through this process are validated by Industry
Lead Bodies and submitted to the relevant authority for approval.

Approved standards are then printed and made available to stakeholders.
These standards are reviewed periodically by the Industry Lead Bodies to
ensure continued relevance to industry needs.

6.3 Who can use National Occupational Standards?

NOS are designed as a resource for individuals and organisations to use to
improve their capacity and capability. They can be used to help define job
roles, measure staff performance, and identify and develop routes for
progression and professional development.
6.4 Structure of the standards

Page 45 of 104
NOS are produced as a suite of units for each occupational area. Each unit
describes an area of work, with the activities normally separated out into
elements with associated performance statements. These statements are
detailed descriptions of the activities which represent effective performance
of the tasks within the unit; a range of situations or circumstances; and the
underpinning knowledge and understanding needed to effectively carry out
the tasks and responsibilities within the particular job role or function.
6.4.1 What is the Structure of an Occupational Standard?

Occupational standards are comprised of the following components:

1. Unit Title - the work activity

2. Element - what has to be done (to perform the work activity)

3. Performance criteria - how the work activity has to be done

4. Range statements - the conditions under which the work activity may be
conducted

5. Required knowledge - what knowledge is needed to perform this work
activity

6. Required skills - what skills are needed to perform this work activity

7. Evidence Guide
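As a rough sketch only (this is not an official template; the class and field names are assumptions for illustration), the components above could be modelled as a nested structure in which a unit holds its elements, and each element carries its own performance criteria and range statements:

```python
from dataclasses import dataclass, field

@dataclass
class Element:
    title: str                                                      # what has to be done
    performance_criteria: list[str] = field(default_factory=list)   # how it has to be done
    range_statements: list[str] = field(default_factory=list)       # conditions under which it may be done

@dataclass
class StandardUnit:
    unit_title: str                                                 # the work activity
    elements: list[Element] = field(default_factory=list)
    required_knowledge: list[str] = field(default_factory=list)     # knowledge needed for the activity
    required_skills: list[str] = field(default_factory=list)        # skills needed for the activity
    evidence_guide: str = ""                                        # critical performance, methods and
                                                                    # context of assessment

# Hypothetical example: one unit with a single element and criterion.
unit = StandardUnit("Perform manual metal arc welding")
unit.elements.append(
    Element("Prepare welding equipment",
            performance_criteria=["Equipment is selected according to job requirements"])
)
```

The nesting mirrors the hierarchy described in the text: performance criteria and range statements belong to an element, while required knowledge, required skills and the evidence guide apply at the level of the unit as a whole.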

6.5 Standards (Elements of Competence) and Units of Competence should reflect:

1. Substantial work roles which have credibility in employment

2. Standards should be meaningful, measurable and reflect the core skills of
planning and problem solving, not merely conformity with rules and
procedures

3. Standards should describe outcomes - the results of work activity

4. Standards should be clear and concise and be immediately
understandable by users (trainees, candidates, assessors)

5. Standards should have a consistent format across different Units
6.6 What are the general Principles of Occupational Standards?

Standards (Elements of Competence) and Units of Competence should
reflect the general principles already listed in section 6.5 above.

In order for standards to be approved, their format and content must meet
the criteria below:

6.7 The Title of a Standard

The Title of a standard should:

Reflect the occupational/skill area

Be stated as a noun where applicable

6.7.1 The Title of Units or Elements of Competence

The Title of a Unit or Element of Competence should:

Use language which is precise and consistent with the appropriate
grammatical structure

Use a clear, unambiguous active verb or verbs to describe the action
required

Represent a discrete unit of work which is complete and assessable, rather
than a procedural step or operation

Describe outcomes, expectations or results of activity, rather than activities,
procedures and methods.

6.7.2 The Title of a Unit or Element of Competence should not:

Describe knowledge, skills or understanding instead of outcomes; describe
outcomes with verbs which simply mean "do"; use ambiguous or secondary
verbs; or place an evaluative term into the statement of competence.

6.8 Performance Criteria

Performance Criteria should:

Consist of a critical outcome or process plus an evaluative phrase or,
where justified, a description of a state which either should or should
not occur

Use evaluative terms which are appropriate (i.e. absolute, with a
tolerance, or conditional/dependent), given the range of variations
which might be allowed or expected in the work environment

Be phrased in simple language that is easily understood by all users

Be sufficient to cover (measure) the element.

Performance Criteria should not: describe all the outcomes as "correct";
reference all performance requirements to the procedures of an organization;
offer options, alternatives or conditions; use repetitive criteria which could
form the basis of an overarching or common Unit or Element; or use highly
generalized or abstract language.

6.9 Evidence Guide

All performance criteria, required knowledge and skills and range must be
assessed.

The evidence guide should include information on:

the critical performance that must be demonstrated;

appropriate methods of assessment;

appropriate context of assessment.

Resource implications will give the basic resource requirements.

6.10 Range Statements

Range Statements should:

Describe the tools, equipment, materials, methods and processes which are
significant to the work activity

Describe significant variations which would require different skills, methods
or processes as required by industry

Reflect current and future requirements for flexibility and breadth.

6.10.1 Range Statements should not:

List variations which do not really require different skills or levels of skill

Offer options or alternatives (all of the range must be assessed).

6.11 Required Knowledge and Skills

The underpinning knowledge and skills should be a clear outline of: the facts
and data; the theories, methods and principles to be understood and
applied; and the underpinning skills required to carry out the performance
criteria.

7.0 COMPETENCY BASED ASSESSMENT

7.1 What is Competency Based Assessment?

Competency Based Assessment means the process of collecting evidence
and making judgments about whether competency has been achieved, to
confirm that an individual can perform to the standard expected in the
workplace as expressed in the relevant endorsed industry/enterprise
competency standards (occupational standards) or the learning outcomes of
an accredited course.

Competency Based Assessment is about identifying competence; it is a basis
for certification of competency, not a set of examinations. It is a very
valuable diagnostic instrument for both the worker and the employer.

It is a system of capturing and measuring a student's knowledge, skills and
attitudes (KSA) in an area and the application of those KSA in industry. It is
also a means through which portable qualifications are achieved, where
performance is judged against nationally recognized competency standards.

7.1.1 What does being competent mean?

Being competent means that the individual has suitable or sufficient skills,
knowledge, experience and attitude according to the specifications of
industry.

Competence means having the skill and knowledge to correctly carry out a
task, a skill, or a function.

7.1.2 What does it mean to be Competent in TVET?

Perform at a predetermined skill level
Respond and react to unexpected situations
Fulfil the role expected in the workplace
Transfer skills and knowledge to new situations

Assessment involves collecting evidence and making judgments on whether
or not competence has been achieved.

Assessment involves:

Gathering and judging evidence about the performance of individuals

Judging against the criteria for what people must know and be able to
do within the context specified in the competency standard.

7.1.3 Occupational Standards

Standards describe the work tasks to be carried out within the
framework of a specific occupational activity, as well as the
related knowledge, skills and abilities.

Standards are used to:

Define National Qualification requirements
Articulate between programmes and institutions
Establish benchmarks against which to assess
Plan and develop career paths
Develop training courses and programmes
Identify human and physical resource requirements for training

7.2 Why assess?

Assessment is a process which is critically important to many aspects of
workplace performance, as it confirms an individual's competence.

Assessment may be carried out for various reasons or purposes, for example:

o To recognize current competence: helps to determine the level/degree
of competence
o To determine language, literacy and numeracy needs
o To determine training gaps
o To establish the learner's or candidate's progress
o To determine the achievement of competence: gives information
about the knowledge, skills and attitudes learners have acquired
o To gain formal recognition of achievement through a Statement of
Attainment
o To gain formal recognition towards a qualification: outcomes are used
to determine the award of a certificate
o To determine whether learners can apply the knowledge they have acquired
o To meet organizational requirements for work
o To gain a licence
o To operate equipment
o For recruitment
o For promotion or classification.

7.3 Who can assess?

Any organization may undertake assessment for its own purposes and
develop its own plan for how this is done. However, in Technical and
Vocational Education and Training (TVET), only accredited competency based
assessment centers can assess the competencies which lead to the issue of
nationally recognized qualifications.

An assessment center must meet the following standards before it is
accredited:

STANDARD NO. 1: The assessment center operates under a system of
management and administration that facilitates quality competency based
assessment. There should be a clear description of the authority and
responsibility relationships among and between the management, staff and
candidates.

STANDARD NO. 2: The assessment center employs an adequate number of
staff on a full-time and/or part-time basis, with appropriate qualifications,
competence and experience to conduct quality competency based
assessment.

Assessors should:

have the necessary training and assessment competencies as
determined by the National Qualifications Authority;
have the relevant technical and vocational competencies, at least to
the level being delivered or assessed;
be able to demonstrate current industry skills directly relevant to the
assessment being undertaken; and
continue to develop their TVET knowledge and skills as well as their
industry currency and assessor competence.

STANDARD NO. 3: The assessment center has sufficient and appropriate
physical and technological resources to facilitate effective assessment of
candidates on attained competencies.

STANDARD NO. 4: The center provides essential support services to
candidates and demonstrates commitment to address their welfare.

Assessment including Recognition of Prior Learning (RPL) should:

meet the requirements of the relevant Training Package or accredited course
be conducted in accordance with the principles of assessment and the
rules of evidence
meet workplace and, where relevant, regulatory requirements
be systematically validated.

7.4 The principles of competency based assessment

Assessment needs to abide by the following principles and must be:
Current
Assessment should take place within a short time of learning/training.
Valid
All components of assessment must be assessed. There must be
sufficient evidence to ensure that the candidate meets the competency
specified by the current standard. The candidate must not be asked to
provide evidence for or be assessed against activities that are outside
the scope of the unit standard.
Reliable
The assessment must be able to stand up to scrutiny. That is, other
assessors should reach the same conclusion. A number of evidence-
gathering methods can be used to ensure consistency.
Flexible
There is no single approach to competency based assessment.
Evidence can be collected using different methods, at different times,
under a variety of conditions. It must be responsive to the needs of the
situation and the candidate.
Fair
Assessment must not discriminate against individuals or groups.
Different people and different situations need different assessment
methods and, where necessary, reasonable adjustments to meet
individual requirements must be made.
Safe
All work and all assessment must comply with occupational health and
safety requirements.
A variety of learning outcomes requires different assessment approaches.

7.5 Assessment Pathways

Remember, the purpose of assessment is to collect evidence to make a
judgement about performance. So, as long as the principles of
assessment and the rules of evidence are followed, it does not really matter
what happens before the evidence is collected.

What this means is that not all candidates will need to do the same thing
before we make our judgement about their performance. This opens up the
opportunity for different assessment pathways. There are three main
pathways, with numerous variations based on the needs of individual
candidates and clients:

Pathway 1: training and assessment pathway, which combines both training
and assessment

Pathway 2: assessment only pathway, which uses only assessment when
training is not required (e.g. RPL/RCC)

Pathway 3: a combination of Pathway 1 and Pathway 2

7.6 Types of assessment

Assessment can be classified in different ways:

7.6.1 Norm referenced assessment / Criterion referenced assessment

a. Norm referenced assessment

Norm referenced assessment is intrinsically competitive. It compares
individuals with each other and ranks them according to the number of
places and opportunities available.

b. Criterion referenced assessment

Criterion referenced assessment (or standards-based assessment) is
assessment against fixed criteria or standards. These predetermined criteria
can take different forms such as:

Units of competency (from Training Packages)
Modules (from curriculum)
Standard operating procedures
Product specifications.

Increasingly, in TVET, units of competency are the benchmarks against which
individuals are assessed.

In TVET, the term competency-based assessment is used regardless of
whether the benchmarks are competency standards or other types of
benchmarks.
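The contrast between the two approaches can be sketched in a few lines of code. This is an illustrative sketch only: the candidate names, scores, number of places and pass mark below are invented for the example and are not drawn from any TVET standard.

```python
# Hedged sketch: norm-referenced vs criterion-referenced selection.

def norm_referenced(scores, places):
    """Rank candidates against each other and keep only as many as there are places."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    return set(ranked[:places])

def criterion_referenced(scores, pass_mark):
    """Compare every candidate against the same fixed benchmark."""
    return {name for name, score in scores.items() if score >= pass_mark}

# Invented scores for four candidates against an invented 70-mark benchmark.
scores = {"Achieng": 82, "Baraka": 74, "Chebet": 74, "Daudi": 60}

# Norm-referenced: only two places, so Chebet misses out despite
# scoring exactly the same as Baraka.
print(sorted(norm_referenced(scores, places=2)))           # ['Achieng', 'Baraka']

# Criterion-referenced: everyone who meets the benchmark passes,
# however many candidates that turns out to be.
print(sorted(criterion_referenced(scores, pass_mark=70)))  # ['Achieng', 'Baraka', 'Chebet']
```

The point of the sketch is that the norm-referenced result changes whenever the number of places or the rest of the cohort changes, while the criterion-referenced result depends only on each candidate and the fixed benchmark.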

7.6.2 Formative/Summative/Holistic Assessment

a. Formative Assessment

Formative assessment takes place throughout a training program. The
learner is assessed and given feedback as they learn, rather than at the end
of the program.

b. Summative Assessment

Summative assessment is assessment conducted at predetermined points in
the learning process or at the end.

c. Holistic Assessment

Assessment of a range of skills and knowledge together is known as holistic
assessment. The methods and tools may assess a number of elements of
competence, or more than one competency unit, at a time. You may know this
as integrated assessment.

7.7 How can assessment be conducted?

Assessment can be conducted as follows:

1. assessment of real work
2. assessment of simulated work
3. assessment of written work
4. assessment of oral responses

How each of these types of assessment fits within our work is shown in the
following diagram:

7.8 Assessment Context

The context of assessment refers to the different situations in which
assessment occurs. Context is about the variety of circumstances that
surround different assessments. Context includes all the factors that affect
assessment: from the environment in which the assessment is to take place,
to the purpose of assessment and the candidates.

The characteristics and factors affecting assessment in your practice
environment will create an assessment context.

For example:

Where will assessment take place? On-the-job, off-the-job or a
combination?
If on-the-job, what kind of environment will it be? For example,
manufacturing or service industry?
Where is the location? For example, is it a remote location?
Who will you be assessing? For example, trainees or apprentices,
experienced workers or entry level learners?
What is the purpose of the assessment?
Will assessment be in the same place or in a variety of workplaces or
institutional settings?

Different assessment contexts will affect the choice of assessment methods,
for example:

If training and assessment occurs only in a training and assessment
institution, then demonstration of skills may have to be through
simulations rather than in real workplaces. Check the relevant
assessment guidelines in the Training Package to see if simulation is
allowed.
Assessment in the workplace may require assessment to occur outside
normal operating hours so that the workflow is not affected.
When using questioning in the workplace you may need to organize to
go to a quiet or private place away from the workflow.
Third party reporting can only be used where a candidate has access to
a third party who can evaluate the candidate's performance in relation
to the benchmark.

In your practice environment, what are some workplace and legislative
requirements you will need to consider when assessments are conducted?
How will this affect the assessment methods you choose?

7.8.1 Key considerations relating to the context of assessment

Accepted assessment systems and processes may already be in existence in
your industry or in an enterprise where you may be required to assess.

These might include:

Training and assessment policies and procedures
Recording and reporting of assessment results
Maintenance and retrieval of assessment information
Established methods and tools for assessment.

Assessment type depends on the context in which we are going to assess.
These contexts include:

1. Institutional, such as a training organization or college
2. On-the-job, at the workplace
3. Work-placement, at a workplace where training is occurring

7.8.2 Institutional Assessment Methods

Institutional contexts are those that involve training and/or assessment at some sort of training venue.
Some of the types of assessment that are suited to the institutional context are shown below.

Method: Observation of performance in a simulated situation such as workshop, classroom, role play
  Purpose: Assess process application; assess practical skills; assess skills in producing a product; assess underpinning skills
  Tools: Checklist; video camera; peer report; supervisor report; self-evaluation

Method: Oral questioning
  Purpose: Assess underpinning knowledge; assess knowledge skills
  Tools: One-on-one interview; group interview

Method: Projects
  Purpose: Assess practical skills; assess underpinning knowledge
  Tools: Finished product; typing speed test; oral questions

Method: Case studies
  Purpose: Assess underpinning knowledge; assess problem-solving skills
  Tools: Scenarios; written questions

Method: Student presentations
  Purpose: Assess underpinning knowledge skills; assess presentation skills
  Tools: Observation; written report; verbal feedback

Method: Written assessments
  Purpose: Assess knowledge skills
  Tools: Worksheets; multiple choice; written short answers

Method: Assignments / online assessments
  Purpose: Assess knowledge skills; assess knowledge; assess practical skills
  Tools: Essay; written short answers; practical tasks; application package; questions on web page; chat forums

Method: Teacher/facilitator report
  Purpose: Assess knowledge skills; assess practical skills
  Tools: Verbal report; written report

7.8.3 On-the-job Assessment Methods

On-the-job contexts are those that involve training and/or assessment at
work. Some of the types of assessment that are suited to the on-the-job
context are shown below.

Method: Supervisor report
  Purpose: Assess practical skills
  Tools: Verbal report; journal entry; performance appraisal

Method: On-the-job assessment
  Purpose: Assess practical skills; assess key competencies
  Tools: Finished product; checklist; mentor report; self-evaluation; team-leader report

Method: Observation of overall performance
  Purpose: Assess application of process skills; assess application of knowledge skills; assess application of key competencies
  Tools: Observation checklist; mentor report; team-leader report; demonstration

Method: Observing and recording satisfactory performance of tasks over a period of time
  Purpose: Assess all components of competency
  Tools: Log book; journal; team-leader report; performance appraisal

Method: Questions
  Purpose: Assess knowledge
  Tools: Questionnaire; interview

Method: Team-based projects
  Purpose: Assess key competencies; assess practical skills; assess components of competency
  Tools: Practical task; email list; written analysis; verbal presentation

Method: Work-based projects
  Purpose: Assess process skills; assess product skills; assess application of key competencies
  Tools: Log books; journals; checklists

7.8.4 Work Placement Assessment Methods

Work placement contexts are those that involve training and/or
assessment in a workplace where the candidate does not normally work; it
is a situation that is organized to allow the candidate to gain practical
experience. Some of the types of assessment that are suited to the work
placement context are shown below.

Method: Supervisor report
  Purpose: Assess practical skills; assess application of key competencies
  Tools: Written report; verbal report; diary entries

Method: Self-assessment by learner
  Purpose: Assess and reinforce knowledge skills; assess and reinforce practical skills; assess and reinforce application of key competencies
  Tools: Observation checklist; self-assessment report

Method: On-the-job assessment by trainer/facilitator
  Purpose: Assess underpinning knowledge; assess application of practical skills; assess application of key competencies; assess components of competency
  Tools: Observation checklist; oral questioning; written questions

Assessment should reflect the real world

7.9 Characteristics of competency based assessment

The defining characteristics of competency based assessment include that it
is:

Standards/criterion-based
Evidence-based
Work-focused/participatory
Criterion-referenced

a. Standards/criterion-based

Fixed standards or criteria are a set of predetermined benchmarks.
Benchmarks for assessment may be:

The requirements of the national assessment guidelines of the relevant
Training Package/s
The guidelines for qualifications and/or units of competency
The performance criteria or evidence requirements of learning
strategies, plans, programs and tools
Any requirements of Occupational Health and Safety legislation, codes
of practice, standards and guidelines
Assessment requirements of the National Reporting System
Organizational requirements or product specifications.

b. Evidence-based

Competency-based assessment is the process of gathering sufficient
evidence to make a judgment about whether the standards specified have
been met. This evidence is mostly demonstrated or produced by the
candidate, and some additional evidence can be obtained from third parties.

c. Work-focused/Participatory

This means that the candidate helps to determine the process of
assessment. Judgment of competence can involve a range of assessment
activities. Assessors and candidates can negotiate the form that the
assessment activities will take. Candidates can be invited to suggest relevant
workplace activities and tasks which could produce evidence for assessment.
So, when you are planning and organizing assessment, consider the ways in
which you can allow the candidate to be an active, rather than a passive,
participant.

In competency-based assessment, the assessor must:

Know about assessment
Have a vocational skill to assess in (or work with someone who has
expertise in the area being assessed)
Work closely with the candidate.

In competency-based assessment, the candidates must be informed of:

The criteria against which they are being assessed
The assessment process or steps
Their ability to take an active part in the process.

7.10 Planning and organizing an assessment

This manual focuses on planning and organizing an assessment in a
competency-based or standards-based situation.

You will be planning and organizing assessment on behalf of:

Candidates (individuals being assessed)
Assessors (those conducting the assessment)
Other people involved in the assessment process, for example
administrative staff, coordinators, or supervisors.

In a competency-based system, a candidate can be assessed through two
pathways:

Assessment through training
Assessment only

Assessment through training

This is where the candidate needs to learn the skills and knowledge first, and
the assessment is conducted:

During the course of training at different intervals (formative
assessment)
During, and on completion of, training either on- or off-the-job
(summative assessment).

Assessment only

This is where skills and knowledge have already been gained, and the
candidate is ready to be assessed against the relevant criteria/benchmarks
without needing to go through a training program. This assessment only
pathway is called many different things:

Recognition of Prior Learning (RPL)
Recognition of Current Competency (RCC)
Skills recognition
Recognition.

The four principles of assessment are crucial to effective assessment in
vocational education and training. It is critical that the assessment situations
you plan and organize reflect these principles.

When you plan and organize an assessment, make sure that:

The evidence will prove that the individual has the required skills and
knowledge as specified in the relevant unit of competency (Validity)
Other assessors would make the same decision (Reliability)
Assessments can be either on- or off-the-job, and at a mutually
convenient time and situation (Flexibility)
The assessor objectively considers all evidence, is open and
transparent about all assessment decisions, and takes into account
relevant characteristics and needs of the candidate (Fairness).

7.10.1 An assessment plan

An assessment plan is a document developed in conjunction with other
relevant key stakeholders. In preparing an assessment plan, you should
document key steps and actions to be taken and plan for risks or
contingencies. The assessment plan ensures that all relevant stakeholders
are aware of what will happen, as well as when, where and how it will
happen.

The assessment plan should include:

The unit/s of competency and elements or other benchmarks to be
assessed: you will need to be familiar with these
The purpose/s of assessment: this needs to be identified so that the
stakeholders are well aware of the reason for the assessment
A profile of the target group: the characteristics and needs of the
candidates
Others involved in the assessment process: teachers/assessors,
administrative staff
How the assessment will occur: that is, the assessment methods and
tools to be used. It can include a description of the method and
examples of tools to be used for the assessment
When and where the assessment will occur: this includes details of
any due dates for submission of evidence, or dates and times of when
the assessment will occur and the proposed location of the assessment
What resources or special arrangements are required: this outlines
what is needed to carry out the assessment, given the special needs of
candidates, organizational requirements, or other legislative or OHS
considerations
Context for assessment: this outlines the details of the environment in
which the assessment will take place and any changes which need to
be made as a result. For example, will it be on-the-job, off-the-job or a
combination of both? Or will the assessment be contextualized to the
work setting?
Instructions for the candidate: this outlines information to be given to
the candidate, related to the assessment exercise at hand.
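One way to keep these plan items together is to treat the plan as structured data with a completeness check, so that a missing item is caught before the assessment is scheduled. The sketch below is illustrative only: the field names and example values are invented for the example, not prescribed by this manual or any standard.

```python
# Hedged sketch: an assessment plan as structured data, with a check
# that flags any item left blank. Field names are illustrative.
from dataclasses import dataclass, fields

@dataclass
class AssessmentPlan:
    units_of_competency: list       # unit/s of competency or other benchmarks
    purpose: str                    # purpose/s of assessment
    target_group_profile: str       # characteristics and needs of candidates
    people_involved: list           # teachers/assessors, administrative staff
    methods_and_tools: list         # how the assessment will occur
    when_and_where: str             # dates, times and location
    resources_and_arrangements: str # resources or special arrangements
    context: str                    # on-the-job, off-the-job or a combination
    candidate_instructions: str     # information to be given to the candidate

def missing_items(plan):
    """Return the names of any plan items left empty."""
    return [f.name for f in fields(plan) if not getattr(plan, f.name)]

plan = AssessmentPlan(
    units_of_competency=["Unit X (illustrative)"],
    purpose="Certification",
    target_group_profile="Entry-level trainees",
    people_involved=["assessor", "coordinator"],
    methods_and_tools=["observation checklist", "oral questioning"],
    when_and_where="",  # deliberately left blank to show the check working
    resources_and_arrangements="Workshop bench, PPE",
    context="Off-the-job, simulated workshop",
    candidate_instructions="Issued one week before assessment",
)
print(missing_items(plan))  # flags the blank 'when_and_where' entry
```

A real plan would of course live in a document template shared with stakeholders rather than in code; the point of the sketch is only that every item in the list above has a slot that can be checked.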

7.11 Evidence requirements

In a competency-based system, evidence is proof that supports the
candidate's claim of competency. It is information gathered which, when
matched against the criteria, indicates that the criteria or benchmarks have
been met.

Evidence can come in a variety of forms. For example:

Demonstration of real work
Demonstration in a simulated environment
Contents of a portfolio
Role-play
Video recordings of a performance
Project
Products made
Responses to a case study
Processes used (documented)
Answers to questions
Procedures completed
Reports from third parties.

Evidence can be gathered from a range of sources and should be linked to
the candidate's current or future workplace application of the competency. A
variety of methods and tools can be used to collect evidence.

7.12 Rules of evidence

Rules of evidence have been identified to ensure that assessment produces
evidence which is valid, sufficient, current, and authentic.

The assessment methods and tools you choose in the planning phase must
be able to collect evidence that meets each of these rules of evidence.

7.12.1 Validity

Assessment methods chosen must ensure that the evidence collected covers
all the requirements in the benchmark or criteria. If the benchmark is a unit
of competency, then the evidence must cover:

All elements
All performance criteria
The dimensions of competency
Employability Skills
Consideration of the Qualifications level
All the "musts" in the range statement
All the critical evidence listed in the evidence guide
All essential skills and knowledge listed in the evidence guide.

7.12.2 Sufficiency

When choosing assessment methods, ensure that the tools can collect
enough evidence to make a decision about the candidate's competency.
Usually, it means collecting evidence to show competency over a period of
time and in different situations. It is also important to make sure the methods
and tools will assess all aspects of competency. A good way to make sure
there is sufficient evidence is to use a combination of different assessment
methods.

7.12.3 Currency

When planning an assessment, you need to determine whether the
evidence-gathering opportunities you have chosen will ensure the candidate
can perform the skills and has the knowledge. Currency is particularly
important when assessing for the purpose of Recognition. Evidence supplied
by applicants for Recognition may be a combination of historical and recent
evidence to show a total picture of current competence.

7.12.4 Authenticity

When planning an assessment, you must be able to ensure that the evidence
to be gathered is the candidate's own work. Evidence-gathering activities such
as direct observations or verbal questions produce authentic evidence easily
because the assessor sees the candidate's skills or hears the answers to
questions. Other assessment methods such as projects or written work may
need further authentication. This can be achieved by using more than one
method, for example a project plus a follow-up interview to clarify the project
stages, or supplementary information in the form of a third party report from
managers or supervisors.

It may not always be appropriate or feasible to directly observe or directly
question the candidate, so there will be times when this may need to be
verified by a third party such as a manager or supervisor. If documentation is
used, it must be verified. Using a range of assessment methods will assist
with establishing authenticity, through the crosschecking of evidence.
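The four rules of evidence can be expressed as simple checks over a set of evidence records. The sketch below is illustrative only: the two-year currency window, the minimum item count and the record fields are invented assumptions for the example, not requirements taken from any standard.

```python
# Hedged sketch: the four rules of evidence as simple checks on
# evidence records. Thresholds and field names are illustrative.
from datetime import date

def check_evidence(items, criteria, today, max_age_years=2, min_items=2):
    """Return a list of problems; an empty list means all four rules pass."""
    problems = []
    covered = {c for item in items for c in item["criteria_covered"]}
    if not set(criteria) <= covered:               # validity: all criteria covered
        problems.append("validity: not all criteria are covered")
    if len(items) < min_items:                     # sufficiency: enough evidence
        problems.append("sufficiency: too few evidence items")
    for item in items:
        age = (today - item["date"]).days / 365.25
        if age > max_age_years:                    # currency: evidence is recent
            problems.append(f"currency: {item['kind']} is too old")
        if not item["authenticated"]:              # authenticity: verified as own work
            problems.append(f"authenticity: {item['kind']} not verified")
    return problems

# Two invented evidence records against three invented performance criteria.
evidence = [
    {"kind": "direct observation", "date": date(2017, 3, 1),
     "criteria_covered": ["PC1", "PC2"], "authenticated": True},
    {"kind": "third-party report", "date": date(2013, 6, 1),
     "criteria_covered": ["PC3"], "authenticated": False},
]
print(check_evidence(evidence, ["PC1", "PC2", "PC3"], today=date(2017, 4, 1)))
```

In this invented example, validity and sufficiency pass, but the old, unverified third-party report fails the currency and authenticity checks, which mirrors the advice above to authenticate indirect evidence through a second method.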

7.13 Assessment Validation

A strong validation process is the key to providing quality and consistency in
assessment.

This unit specifies the outcomes required to participate in an assessment
validation process. Validation is a process involving assessors working in
collaboration to review, compare and evaluate their assessment processes
and outcomes. This includes validating assessment methods and tools, and
interpreting the evidence collected to make a judgement of competence in
relation to the same unit/s of competency.

More specifically, this unit will help you develop skills and knowledge to
enable you to:

Prepare for validation by planning for participation in validation
processes in your practice environment and gathering together the
materials you will need to evaluate during the validation process.
Contribute to validation by actively participating and ensuring that the
critical aspects of validation are addressed.
Analyse and document outcomes of validation by implementing
recommendations formulated during the review process and changing
your practices to support improvements in the quality of assessment.

Assessment validation requires competence in interpreting competency
standards as benchmarks for assessment, planning and conducting
assessments, and developing assessment tools.
7.13.1 Preparing for validation

This Learning Topic examines the reasons for carrying out validation in
assessment. It looks at the preparation you need to undertake so that you
can participate successfully in validation processes.

Assessment validation is an essential part of continuous improvement in a
training and/or assessment organization, as it gives you information about
the appropriateness and effectiveness of assessment. In the validation
process, information is collected about various aspects of assessment. This
information is collated and analyzed; improvements are identified and then a
plan is put in place to introduce these improvements. Progress is then
reviewed against this plan. Good preparation is essential in any continuous
improvement process.

7.13.2 What is assessment validation?

Assessment validation involves comparing, evaluating and reviewing
assessment processes, methods and tools and the subsequent assessment
decisions. Validation confirms the validity and reliability of judgements made
by assessors against the benchmark, for example a unit of competency. It
assists assessors and their organisations to ensure consistent assessment to
the standard required by the assessment benchmark. Validation can be
applied to a range of assessment activities, products and processes.

For example, it is possible to validate the processes used to inform learners
about assessment; the assessment tools used to assess against
competencies within a Training Package; or the processes used when
assessing using Recognition of Prior Learning (RPL), also known as
Recognition of Current Competence (RCC), Skills Recognition or Recognition.

7.13.3 Your role in assessment validation

If you are participating in assessment validation, then your role may include:

Preparation: establishing the purpose, focus and context of the
assessment validation.

Obtaining, reading and interpreting materials: these might be
provided directly to you prior to the assessment validation or you
might have to locate them yourself. You will probably be familiar with
some of these materials.
Submitting your materials: you may be required to submit
examples of assessment materials you use with candidates, such as
assessment tools including instructions, assessment checklists,
questions and case studies.
Participating in assessment validation activities: depending on
the type of approach used, you may be part of an assessment panel,
team assessment or any other approach used. You will need to be
prepared to actively contribute to validation sessions using appropriate
communication skills.
Discussing validation findings and suggesting recommendations
to improve the quality of assessment.

7.13.4 Assessment validation

Validation is part of an organisation's quality processes.

Accredited assessment centers must validate their assessment strategies
by:

Reviewing, comparing and evaluating the assessment process, tools
and evidence contributing to judgments made by a range of assessors
against the same competency standards, at least annually; and
Documenting any action taken to improve the quality and consistency
of assessment.

These may be internal processes with stakeholder involvement or
external validations with other providers and/or stakeholders.

7.13.5 Why validate assessment?

For any organisation using assessment, validation will ensure that the
assessment processes, methods, tools and decisions are valid and reliable.

Validation assists organisations to ensure that a quality assessment service
is offered. It is also a mechanism that encourages assessors to learn from
one another. It helps to raise the confidence of assessors and clients in the
assessment processes used, and leads to greater accountability.

Assessment validation is a formalised process open to scrutiny. Tea room
conversations about assessment, useful as they are, are not formalised
validation activities. Assessment validation needs to be:

Planned
Targeted at a specific audience
Documented
Focused on identified areas such as assessment methods and tools.

7.13.6 Purposes of validation

The purposes of validation are to:

Demonstrate compliance with the AQTF standards for RTOs
Provide evidence for external and/or internal audits
Improve assessment practices
Evaluate the quality of assessment tools
Provide professional development for assessors
Increase assessor and facilitator confidence
Determine whether different assessors/facilitators using the same tools
collect the same types and levels of evidence
Determine whether different assessors and facilitators interpret the
same evidence similarly
Determine whether assessment decisions reflect the rules of evidence.

In addition, the assessment validation process should highlight areas
that need further attention.

Depending on the focus, assessment validation should identify whether:

Assessment policies and procedures are effective and are being
followed
Candidates are receiving the kind of information they need about
assessment
Assessment resources are properly designed
Assessors are assessing consistently.

Validation could evaluate whether the assessment benchmarks, for
example competency standards, are being assessed in a consistent, fair
and valid manner. Other benchmarks could include:

The requirements of the national assessment guidelines of the relevant
Training Package
The performance criteria or evidence requirements of learning
strategies, programs or assessment plans
Requirements of OHS legislation, codes of practice, standards and
guidelines
Assessment requirements of the National Reporting System
Organizational requirements or product specifications.

This Learner Guide focuses on the validation of assessment based on units of
competency.

It is not feasible or necessary (nor possible in some situations) to validate the
assessment in all of the qualifications and units of competency on the RTO's
Scope of Registration once a year. However, the lessons learnt when
validating the assessment of one set of competency standards can be used
to inform assessment of other competency standards.

7.13.7 When does validation occur?

Validation can be carried out before, during and after assessment.

a. Before assessment

At this stage, validation concentrates on the design of the assessment tools
and the interpretation of the units of competency to be assessed. It is
important to ensure that assessors have a common understanding of the
standard to be achieved and the evidence to be collected.

b. During assessment

At this point, assessment validation concentrates on the performance of the
candidate during assessment, the process of assessment and the way the
assessor carries out assessment.

c. After assessment

At this stage, assessment validation concentrates on how effective the
assessment was, the standards of performance achieved, the validity of the
evidence collected, and the accuracy and consistency of the assessment
judgement.

7.13.8 What approaches can the validation process take?

Have you ever been involved in discussions about what constitutes valid and
fair assessment? Validation approaches include:

Assessment panels
Validation and/or moderation meetings
Collectively developing and/or reviewing banks of assessment tools
and exemplars
Benchmarking
Field testing, trialling and piloting assessment tools
Peer review
Team assessment
Internal audit process
Client feedback mechanisms
Mentoring by more experienced assessors and facilitators
Use of an independent assessment validator to review.

Remember, as a participant in an assessment validation activity, it is
assumed that the coordinator will have already decided which approach will
be used. However, you may be asked to contribute to a decision about which
approach to use. Each approach can be used to discuss or examine different
aspects of the assessment process.

7.14 Roles and Responsibilities of Key Players in Assessment

a. Candidates

Candidates are individuals who are seeking to be assessed as competent
against a Standard(s).

Candidates' responsibilities are to:

work with their Assessor(s) to identify opportunities where they can
demonstrate their competence in accordance with the relevant
Standard(s);
perform agreed tasks in the workplace in accordance with all health,
safety and environmental requirements;
gather evidence (where required) to support their claim of competence
in accordance with the relevant Standard(s).

b. Assessors

Assessors' responsibilities are to:

plan and manage the assessment process;
carry out assessments of Candidates' performance against the relevant
Standard(s);
ensure that Candidates' evidence is relevant, valid, authentic, current
and sufficient;
make a judgement as to the competence of the Candidates;
record assessment decisions.

Assessors must be competent and qualified to carry out the assessment
process; they should meet the following minimum requirements:

be technically competent in the discipline area of the Standards they
are assessing against; this could be demonstrated in a number of
ways:
they are an experienced practitioner or supervisor in the same
discipline;
they have been assessed as competent for the relevant Standard(s)
and/or have achieved an equivalent qualification;
they have previously performed or supervised the activities defined in
the Standard(s) and can demonstrate that they have maintained their
technical expertise and knowledge of current processes and practices;
hold a recognised Assessor's qualification (e.g. L&D9DI, L&D9D, A1,
A2, D32, D33, L20, or the OPITO Approved Competence Assessor
Certificate);
fully understand the requirements defined in the Standard(s) for which
they are carrying out the assessment process;
comply with the required assessment and internal verification
processes and quality procedures for the Standard(s);
comply with the requirements for recording assessment decisions for
the Standard(s);
liaise with other Assessors and Internal Verifier(s) to ensure a
consistent approach to assessment.
Assessors should maintain the currency of their skills:
Assessors should participate in regular updates/training/Continuous Professional Development (CPD) activities, typically on a minimum of an annual basis;
Assessors who have not carried out any assessments for a period of two years or more should undertake refresher training before carrying out any assessment activities. This refresher training should ensure that they are fully conversant with the criteria defined in the Standards they will be assessing against and, if relevant, the current version of their Assessor qualification.

Additional notes on the requirements for Assessors: the following are examples of individual disciplines for which Assessors would be required to demonstrate technical competence: offshore deck operations; processing operations: hydrocarbons; well services (coiled tubing); well services (mechanical wireline); well services (providing fluids); well services (providing nitrogen); maintenance (electrical); maintenance (mechanical); maintenance (instrument & control). Should an Assessor also act as an Internal Verifier, the Assessor cannot take any part in the verification of the Candidates that they have assessed; this must be done by a different Internal Verifier. Assessors who are in training can undertake assessments, but all assessments against Standard(s) should be reviewed by a qualified Assessor and/or an Internal Verifier. Assessors in training would normally be expected to complete their own Assessor qualification within a year of their involvement in the assessment process.

c. Internal Verifiers

Internal Verifiers' main responsibilities are to:

ensure the quality and consistency of assessment decisions made by the Assessors;
ensure that the assessment processes comply with required quality assurance systems;
provide feedback to Assessors on the judgements they have made of Candidates' competence.

Internal Verifiers must be competent and qualified to carry out the verification process; they should meet the following minimum requirements:

have sufficient occupational expertise in the broad discipline area covered by the relevant Standard(s) to permit valid judgements about assessments and appeal decisions; typically this would be demonstrated by the Internal Verifier having worked at either operational or supervisory level in the broad discipline area;
hold a recognised Internal Verifier's qualification (e.g. L&D11, V1, D34 or the OPITO Approved Internal Verifier Certificate);
fully understand the content of, and the assessment requirements for, the Standard(s) they are responsible for verifying;
sample evidence across all Assessors and Candidates;
provide feedback, advice and support to Assessors;
comply with the internal verification processes and quality procedures for the Standard(s);
maintain records of internal verification activities for the Standard(s);
conduct and/or participate in standardization activities to ensure a consistent approach to assessment;
participate in, and support, internal quality systems and ensure that any corrective actions and recommendations required following internal audits are carried out in a timely manner.

The Internal Verifier may also carry out the following additional activities, in accordance with their organisation's and, where relevant, the Awarding Body/Organisation's quality systems: implement an appeals procedure to settle any disputes between Candidates and Assessors; facilitate, or contribute to, the induction, training and development of Assessors; participate in, and support, external quality audits and ensure that any corrective actions and recommendations required following the audits are carried out in a timely manner.

Internal Verifiers should maintain the currency of their skills: Internal Verifiers should participate in regular updates/training/Continuous Professional Development (CPD) activities, typically on a minimum of an annual basis; Internal Verifiers who have not carried out any verification for a period of two years or more should undertake refresher training before carrying out any internal verification activities. This refresher training should ensure that they are fully conversant with the Standard(s) they will be internally verifying and, if relevant, the current version of their Internal Verifier qualification.

d. Expert Witnesses

Where it is not possible or practical to have a qualified Assessor in the same location as a Candidate, an Expert Witness could be used to support the assessment process by carrying out on-the-job observations.

Expert Witnesses should meet the following minimum requirements: be discipline experts with typically a minimum of 2 years' relevant experience in the discipline area; participate in a briefing session to ensure that they are familiar with the Standard(s) being assessed and that they understand their role.

An Expert Witness would be required to:

agree a time with the Candidate for the observation to take place and advise the Assessor, where possible;
observe the Candidate carrying out normal work tasks/activities;
record details of each task/activity observed and confirm its completion according to the required Standard(s);
authenticate any supporting documentation/job paperwork;
comment on the Candidate's technical ability, knowledge of equipment, teamwork, safe working practices, etc.;
make a recommendation to the Assessor on the Candidate's ability to carry out the task/activity.

Where on-the-job observations are carried out by an Expert Witness, a qualified Assessor would continue to be responsible for all other assessment activities, including assessment planning, providing feedback, review of product evidence and questioning.

The Assessor would review all the evidence provided by the Candidate,
including the observation by the Expert Witness, and make a judgement on
the competence of the Candidate. It would be the responsibility of the
Assessor to make sure that any testimony from an Expert Witness is reliable
and technically valid.

8.0 DESIGNING ASSESSMENT TOOLS FOR QUALITY OUTCOMES IN
TVET

8.1 What is an assessment tool?

Assessment tools are materials that enable you to collect evidence using
your chosen assessment method.

Assessment tools are the instruments and procedures used to gather and
interpret evidence of competence:

The instrument is the activity or specific questions used to assess competence by the assessment method selected. An assessment instrument may be supported by a profile of acceptable performance and the decision-making rules or guidelines to be used by assessors.
Procedures are the information or instructions given to the candidate and the assessor about how the assessment is to be conducted and recorded.

8.2 The principles of assessment

When developing assessment tools, you need to ensure that the principles of
assessment are met. This is not only good practice but also a requirement of
the National Qualification Framework. The assessment principles require that
assessment is valid, reliable, flexible and fair.

Validity refers to the extent to which the interpretation and use of an assessment outcome can be supported by evidence. An assessment is valid if the assessment methods and materials reflect the elements, performance criteria and critical aspects of evidence in the evidence guide of the unit(s) of competency, and if the assessment outcome is fully supported by the evidence gathered.
Reliability refers to the degree of consistency and accuracy of the assessment outcomes. That is, the extent to which the assessment will provide similar outcomes for candidates with equal competence at different times or places, regardless of the assessor conducting the assessment.
Flexibility refers to the opportunity for a candidate to negotiate certain aspects of their assessment (for example, timing) with their assessor. All candidates should be fully informed (for example, through an Assessment Plan) of the purpose of assessment, the assessment criteria, methods and tools used, and the context and timing of the assessment.
Fair assessment does not disadvantage particular candidates or groups of candidates. This may mean that assessment methods are adjusted for particular candidates (such as people with disabilities or cultural differences) to ensure that the method does not disadvantage them because of their situation. An assessment should not place unnecessary demands on candidates that may prevent a candidate from demonstrating competence (for example, an assessment should not demand a higher level of English language or literacy than that which is required to perform to the workplace standard outlined in the competencies being assessed).

8.3 The rules of evidence

Well-designed assessment tools will help to ensure that the evidence collected is:

Valid: there is a clear relationship between the evidence requirements of the unit of competency and the evidence on which the assessment judgement is made
Sufficient: the performance criteria and evidence guide are addressed; competency over a period of time is demonstrated; all dimensions of competency are addressed; competency in different contexts is demonstrated
Current: the evidence demonstrates the candidate's current knowledge and skills
Authentic: it can be verified that the evidence is the candidate's own work.

Assessment strategies and tools need to be developed in consultation with industry and should be tested on an appropriate sample of candidates.
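As a sketch, the four rules of evidence above can be recorded as a simple check, so that an assessor's judgement on each rule is documented consistently for every piece of evidence. This is purely illustrative; the function and field names are assumptions, not part of any national framework:

```python
# Illustrative sketch only: recording a rules-of-evidence check for one
# piece of candidate evidence. All names here are hypothetical.

RULES = ("valid", "sufficient", "current", "authentic")

def evidence_check(**judgements):
    """Record a yes/no judgement for each rule of evidence.

    Raises ValueError if any rule is missing, so that no rule can be
    skipped silently.
    """
    missing = [rule for rule in RULES if rule not in judgements]
    if missing:
        raise ValueError("no judgement recorded for: " + ", ".join(missing))
    record = {rule: bool(judgements[rule]) for rule in RULES}
    # Evidence meets the rules only if every individual rule is met.
    record["meets_rules"] = all(record[rule] for rule in RULES)
    return record

# Example: evidence that is not current fails the overall check.
result = evidence_check(valid=True, sufficient=True, current=False, authentic=True)
print(result["meets_rules"])  # False
```

The point of the sketch is simply that all four rules are checked together: failing any one of them means the evidence cannot support a judgement of competence.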

8.4 Steps to quality assessment tools

As with the design of all products, the quality of an assessment tool will
depend heavily on the time and effort that goes into the research and
development phases of its construction, and the ongoing testing and refining
of prototypes.

There are four simple steps in the design process:

Step One: Familiarize yourself with the mandatory requirements of the assessment task/s.

Step Two: Use your understanding of the specified competencies to choose appropriate assessment method/s.

Step Three: Get down to business and devise the assessment tool/s.

Step Four: Trial and refine your tools, to help you maximize confidence that the tool/s can be used flexibly and assist you to make valid, reliable and fair judgements.

In summary, the following four-step process will assist you to design assessment tools that produce quality outcomes:

PLAN
Step One: Clarify the evidence requirements
Step Two: Choose the most appropriate assessment methods

ACT
Step Three: Design and develop the assessment tools

REFLECT
Step Four: Trial and refine the tools

8.4.1 Step One: Clarify the evidence requirements

a. Picturing competence

This involves testing your own understanding of the requirements of a unit/units of competency by visualizing a competent person at work.
When you are clear about the tasks that such a person will perform and
manage, what contingencies might arise, and in what contexts they are likely
to apply their skills, you are ready to design a training program and select an
appropriate assessment methodology. Your picture may be recorded as a
competency profile, written in accessible language that is familiar to a
candidate and/or workplace.

Once your competency profile is developed, and the sum total of the
activities that are undertaken by a person doing that job are examined, you
will be in a better position to identify opportunities to cluster units of
competency to reflect actual workplace practices.

b. Examining the benchmarks

To decide whether a person is competent you need a set of criteria or benchmarks against which to assess their competencies. In the TVET sector, national competency standards, the smallest of which is a unit of competency, are the usual benchmarks against which a candidate is assessed. Other benchmarks might include assessment criteria or evidence requirements from accredited courses, and organizational benchmarks such as operating procedures, OHS standards and product specifications.

The following diagram broadly illustrates the relationship between:

benchmarks
evidence requirements
assessment methods and tools; and the
evidence produced.

Benchmarks determine evidence requirements

Benchmarks are the standards against which a candidate is assessed. Example: serve customers in a restaurant.

Evidence requirements are the information which, when matched against the benchmarks, shows that a candidate is competent. Example: the learner's knowledge on handling customers/hygiene; customer service/public relations/personal grooming.

Assessment (evidence gathering) methods are the techniques used to gather different types of evidence. Example: real work observation; third party assessment.

Assessment (evidence gathering) tools are the instruments and instructions for gathering and interpreting different types of evidence. Example: observation checklist; third party report; instructions for candidates and assessors.

Evidence produced is the information on which the assessment judgement is made. Example: demonstration of knowledge on handling the clients/personal hygiene; demonstration of customer service and public relations.

Evidence aligned to benchmarks.

c. Confirming the Evidence Requirements

Evidence is the information that, when considered against a unit of
competency, enables you to confidently judge whether or not someone is
competent. Unlike other forms of assessment, competency based
assessment does not involve comparisons between candidates. Candidates
are assessed against standards that are clearly defined and articulated.

In order to decide what evidence you need to collect, you need to be absolutely sure of the competency requirements by examining a number of sources of essential information including:

the elements of the unit/units of competency, the performance criteria, required skills and knowledge, range statement, the evidence guide, and assessment guidelines
the dimensions of competency: the task, task management, contingency management and job/role environment skills
the employability skills
the language, literacy and numeracy skill levels
the relevant qualification descriptor
related workplace processes, procedures and systems that assist you to contextualize the activity you are required to assess. Be sure to include any legislative, OHS or legal requirements that may need to be considered when conducting assessment.

d. Identifying your candidates

The candidates that your assessment methods and tools need to cater for might be quite broadly based or may come from a clearly defined target group, such as an enterprise, an industry sector, occupants of a particular job profile, or a group defined by funding body requirements. Wherever possible, it is important that you identify your candidate group in order to design appropriate tools.

8.4.2 Step Two: Choose your assessment methods

When choosing your assessment methods for particular unit/s of competency, you need to use your unit of competency as your guide. With your profile of a competent worker in mind, and knowing what knowledge and skills you require your candidates to demonstrate, you are now in a position to determine which methods you will use to gather that evidence, in collaboration with students as well as colleagues/other assessors and, where practical, industry/enterprise representatives.

When choosing an assessment method you must consider:

how you will gather the evidence
the needs of your candidates
who will collect the evidence
where you will gather the evidence
when you will gather the evidence

Other practical considerations

Factors that will influence your capacity to manage the evidence-gathering process that you select might include:

the mix of students that you are working with
the size of the student cohort
the location of your students (on/off campus)
your/their access to equipment and facilities
costs and resource requirements
stress placed on students and staff by your requirements.
a. How will you gather the evidence?
Selecting an appropriate assessment/evidence-gathering method is part of the fun and challenge of professional teaching practice. It usually involves weighing up a range of assessment methods in order to decide upon 'best fit' techniques, which may include those in the following table.

Assessment methods and examples of each method:

Direct observation: real work/real time activities at the workplace; work activities in a simulated workplace.

Structured assessment activities: simulation exercises/role-plays; projects; presentations; activity sheets.

Questioning: written questions; interviews; self-assessment; verbal questioning; questionnaires; oral or written examinations (may be applicable at higher AQF levels).

Evidence compiled by the candidate: portfolios; collections of work samples; products with supporting documentation; historical evidence; journal/log books; information about life experience.

Review of products: products as a result of a project; work samples/products.

Third party feedback: testimonials/reports from employers/supervisors; evidence of training; authenticated prior achievements; interviews with employers, supervisors or peers.

b. Considering the needs of your candidates

Your choice of assessment methods will be influenced by a number of factors, not least of which is meeting the needs of your candidates. Your selection of the methods needs to take into account their circumstances while maintaining the integrity of the unit of competency or cluster. For example, Indigenous candidates may prefer to demonstrate rather than talk about their knowledge. Candidates with a disability may require longer time to complete a task. Candidates returning to study or the workforce after a long period of unemployment may have lost confidence and find it difficult to perform in front of others.

Your choices will also be influenced by your determination of the literacy and
numeracy skills and language proficiency of your candidates, and the skill
levels that are required in the qualification. If you are in any doubt, you may
need to draw on the expertise of specialist Language Literacy and Numeracy
(LLN) professionals to make this judgement.

To the extent that it is practical, industry representatives/employers and candidates need to take an active part in the planning of your assessment process. Their involvement will be of practical value to you and may increase their ongoing commitment to and satisfaction with the quality of training and assessment that you offer.

c. Who will collect the evidence?

In selecting your assessment methods, you will also be making inherent
judgements about who will collect the evidence. Training Package
Assessment Guidelines may provide assistance with who can collect
evidence. It is important, whether it is the candidate, the assessor or a third-
party evidence gatherer, that the instrument and instructions of your
assessment tools clarify what is expected, and provide a clear structure for
them to follow.

d. Where will you gather the evidence?

Where you gather the evidence will be influenced by the requirements of the
Training Package or course. Most will recommend the workplace as the
preferred setting, where you will need to make sure that safety issues are
considered, and disruptions to the workplace are minimized.

If workplace assessment is not feasible or appropriate, your alternative is to select settings and methods that enable the candidate to demonstrate their competence to the level of performance specified. Simulation is a form of evidence gathering that involves the candidate in completing or dealing with a task, activity or problem in an off-the-job situation that replicates the workplace context.

Simulations vary from recreating realistic workplace situations, such as in the use of flight simulators, through the creation of role-plays based on workplace scenarios, to the reconstruction of a business situation on a spreadsheet.

Before considering a simulation:

check the requirements of the relevant Training Package and industry views on the use of simulation
consider forming a partnership with local enterprises that may provide access to a workplace or equipment, authentic workplace documents or advice on how to create a realistic simulated environment
review the whole qualification or units of competence to be assessed to build in opportunities for assessing whole work tasks or clusters of competencies.

Your obligations
Regardless of the type of evidence that you collect and examine, you are required to meet the requirements of the AQTF. Before you move to designing your assessment tools, take the time to consider whether the assessment methods you have selected enable you to meet the principles of assessment.

Will your assessment methods result in outcomes that are:

Valid (assesses what it says it does)?
Reliable (other assessors would make the same judgement if they reviewed the same evidence)?
Flexible (the needs of the candidate are taken into account in terms of the methods, the time and the place)?
Fair (the assessment allows all candidates to demonstrate their competence)?

Having selected your assessment methods, you are now in a position to design your assessment tools.

8.4.3 Step Three: Design and develop your assessment tools

Having clarified the evidence requirements and identified which assessment methods you will use, it is now time to design the assessment tools.

Assessment tools contain both the instrument and the instructions or procedures for gathering and interpreting evidence. They serve the evidence-gatherer's need for objectivity and transparency and the candidate's need for clarity and structure.

Importantly, they should provide clear guidance and support for candidates so that there is no ambiguity about what is required of candidates or the basis on which assessors will make decisions. They can also, if well designed, be used for recording and reporting purposes.

Assessment tools generally make provision for the following practical requirements:

candidate's name
assessor(s)' name
date of assessment
unit/cluster title
assessment context
procedure for assessment
list of knowledge/skills to be assessed
competence achieved/outcomes of the assessment
candidate feedback
candidate signature/date
assessor signature/date
instructions to candidate and assessor or other evidence gatherer
resource requirements of the assessment.
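As a sketch, the practical requirements listed above can be represented as a single record type, so that every completed tool captures the same fields. This is purely illustrative; the field names and the completeness rule are assumptions, not prescribed by any framework:

```python
# Illustrative sketch only: one possible record structure for the
# practical requirements an assessment tool provides for. Field names
# are hypothetical, not taken from any national framework.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class AssessmentRecord:
    candidate_name: str
    assessor_name: str
    date_of_assessment: str          # e.g. "2017-04-20"
    unit_title: str
    assessment_context: str
    procedure: str
    knowledge_and_skills: List[str]  # what is to be assessed
    outcome: Optional[str] = None    # e.g. "competent" / "not yet competent"
    candidate_feedback: str = ""
    candidate_signed: bool = False
    assessor_signed: bool = False

    def is_complete(self) -> bool:
        """A record is complete once an outcome is recorded and both
        parties have signed and dated it."""
        return self.outcome is not None and self.candidate_signed and self.assessor_signed

record = AssessmentRecord(
    candidate_name="J. Doe",
    assessor_name="A. Smith",
    date_of_assessment="2017-04-20",
    unit_title="Serve customers in a restaurant",
    assessment_context="workplace",
    procedure="direct observation with questioning",
    knowledge_and_skills=["personal hygiene", "customer service"],
)
print(record.is_complete())  # False: no outcome or signatures yet
```

A shared record of this kind also supports the recording and reporting purposes mentioned above, since every assessor fills in the same fields.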

The tools that you design must comply with the rules of evidence, i.e. the tool must facilitate the gathering of evidence that is:

valid (covers all requirements of the unit of competency)
sufficient (enables you to make a decision about competency over time and in different situations)
current (competent performance is contemporary)
authentic (is the candidate's own work).

Fit for purpose

Your assessment tool gives shape and form to your chosen assessment
method. It must, therefore, be fit for purpose, which means you need to ask
yourself which tool is needed to most effectively and efficiently support your
chosen assessment method. Particular attention should be paid to the
language, literacy and numeracy skill level of the candidates and the
requirements of the units of competency when designing your tools.

It is a requirement of the qualification that:

assessment materials are consistent with the requirements of the Training Package and the assessment strategy
candidates have timely access to current and accurate records of their participation and progress
employers (and others), where relevant, are engaged in the development, delivery and monitoring of training and assessment.

Standardized tools are often a useful option, as they provide a cost-effective starting point from which assessors can develop their own tools. They are also useful for developing common understanding amongst groups of assessors. For new assessors, they are important confidence-building tools.

Instructions for candidates and assessors

Instructions for the candidate and the assessor are an integral part of all assessment tools. Your instructions should respond to questions regarding the what, when, where, how and why of assessment processes. You might include suggestions on reasonable adjustment to accommodate diversity and/or advice on your recording requirements for the assessor/observer.

These instructions, which should be written in plain English, can be included in the instrument or in a separate document.
Tools for direct observation

Observation is an important method for competency-based assessment, which requires candidates to demonstrate not only what they know, but also what they can do. Observation enables you to see directly what candidates can do. A number of tools can be developed to support this assessment method including:

observation checklists
questions to accompany checklists
instructions to candidates and observers/assessors.

a. Observation checklists

An observation checklist is useful when observing performance in both real-work situations and simulated environments where candidates are able to demonstrate:

vocational skills
employability skills
application of workplace procedures, including OHS procedures.

An observation checklist enables the assessor or other evidence-gatherer to observe in a focused way, to take structured notes that can be referred to when making the assessment decision, to provide informed feedback to candidates, and to enhance the objectivity of the assessment decision.

You should also include clear instructions for the candidate and for the
assessor either on the checklist or in a separate document:

Candidates need to know exactly what is expected of them, and any materials that they are required to supply.
Observers need to know exactly what they are looking for, what resources are needed, and any other issues that need to be taken into account. They also need to know how to use the observation checklist.
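As a sketch, the structure of such a checklist can be outlined in code: each criterion is ticked off with an optional note, and the evidence-gatherer can see at a glance whether every criterion was observed. The criteria shown are invented examples, not drawn from any Standard:

```python
# Illustrative sketch of an observation checklist: each criterion is
# ticked off with a note, then summarised for the assessment decision.
# Criteria and names here are hypothetical examples.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ChecklistItem:
    criterion: str
    observed: bool = False
    note: str = ""          # structured note to refer back to later

@dataclass
class ObservationChecklist:
    candidate: str
    unit: str
    items: List[ChecklistItem] = field(default_factory=list)

    def record(self, criterion: str, observed: bool, note: str = "") -> None:
        self.items.append(ChecklistItem(criterion, observed, note))

    def all_observed(self) -> bool:
        """True only if at least one criterion was recorded and every
        recorded criterion was observed."""
        return bool(self.items) and all(item.observed for item in self.items)

checklist = ObservationChecklist("J. Doe", "Serve customers in a restaurant")
checklist.record("Follows personal hygiene procedures", True)
checklist.record("Greets customers appropriately", True, "confident manner")
print(checklist.all_observed())  # True
```

The notes field is what makes the observation useful afterwards: it supports informed feedback to the candidate and an objective, reviewable assessment decision.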

A completed simple observation checklist is provided by way of example, and a template that can be adapted for a range of situations follows. In both cases, instructions for assessors and candidates would also need to be developed.

b. Performance questions to accompany checklists

Observation checklists may be supported by a list of performance questions, which are derived from the Evidence Guides in the unit/units of competency. These questions cover dimensions of competency, such as contingency management skills ("what would you do if...?"), job/role environment skills ("what are the procedures and policies for...?") and task management skills ("what are your functions and how do you manage them when you...?").
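One way to organise such performance questions, sketched below, is a small bank tagged by the dimension of competency each question probes, so the assessor can confirm that every dimension is covered. The questions themselves are invented illustrations, not drawn from any Evidence Guide:

```python
# Illustrative sketch: a small bank of performance questions tagged by
# the dimension of competency they probe. The questions are invented
# examples, not drawn from any Evidence Guide.

from typing import List

QUESTION_BANK = [
    ("contingency management", "What would you do if the equipment failed mid-task?"),
    ("job/role environment", "What are the procedures and policies for reporting a hazard?"),
    ("task management", "What are your functions and how do you manage them when demand peaks?"),
]

def questions_for(dimension: str) -> List[str]:
    """Return all questions in the bank that probe one dimension."""
    return [question for tag, question in QUESTION_BANK if tag == dimension]

print(len(questions_for("task management")))  # 1
```

Tagging questions this way makes gaps visible: a dimension with no questions against it is a dimension the checklist will not gather evidence for.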

Tools for structured assessment activities

In cases where you are constructing a structured/simulated assessment activity to test competency you will need to develop a range of assessment tools, which could include:

a scenario/outline of the situation
scripts for people involved in the activity/simulation
instructions for the candidate and the assessor
an observation checklist.

The scenario

The scenario can be a simple card outlining the scenario to the candidate,
any other participants, and the assessor.

Scripts for people involved in the scenario/role-play

You will need to provide scripts for any participants who help to create the
situation.

Instructions for the candidate

In addition to the information that should be provided to candidates facing any assessment task, candidates should be advised about what is being assessed through the dramatisation. If you plan to use any recording instruments, such as videos or tape recorders, your instructions should include this information, so that candidates are prepared and can be as relaxed as possible during the assessment.

A workplace simulation

The following guidelines for workplace simulation may help the assessor decide if this is an appropriate assessment method. This is followed by an example, which includes the instructions/procedures for an assessor and the assessment instrument, for an activity that simulates a hazardous situation.

Guidelines for workplace simulation

Before making a decision to use simulation, consider:

Training Package requirements and industry views on the use of simulation
the benefits and limitations of using a simulation
learner characteristics and needs
available workplace opportunities
the cost of establishing and using simulated environments
how the simulated assessment can be combined with other forms of evidence gathering such as logbooks, portfolios or work placements
whether simulation meets the principles of assessment for this unit or cluster.

Preparing the assessment event

If you are assessing within a TVET training institution, consider forming a partnership with local enterprises that may provide access to a workplace or equipment, authentic workplace documents or advice on how to create a realistic simulated environment
Review the whole qualification or units of competence to be assessed to build in opportunities for assessing whole work tasks or clusters of competencies. Where appropriate, include opportunities to assess relevant generic competencies such as teamwork, communication, occupational health and safety and leadership
Include contingencies as part of the assessment design. For example, candidates might be required to deal with the pressures of telephones, time constraints and interruptions to workflow
Focus the assessment activity on processes as much as the end product
Apply operational procedures and occupational health and safety requirements as they would be in a real work setting
Validate methods, context and concepts with industry/workplace representatives to ensure the accuracy of the assessment approach
Prepare an observation checklist that clearly outlines the critical aspects.

Preparing the physical location

Consult with workplace/industry experts on what should be included
Check real workplaces to get ideas about current practice and ways of setting up work spaces and equipment
Where practical, alter the training environment so that it reflects a real workplace
Use equipment and other facilities that are as close to those used by industry as possible.

Preparing the candidate(s)

Give candidates a pre-assessment briefing outlining the assessment method, process and tools
Discuss the criteria against which their performance is to be assessed
Give candidates adequate information about the role they are to undertake and the significance of the event.

Conducting the assessment

Where practical, involve industry experts in the assessment process and the decision making
Where appropriate, video the performance of the candidate
Use a checklist of critical aspects to focus the observation of candidate performance
Use self-assessment, peer assessment and debriefing activities to add to the evidence gathered and help candidates develop reflective skills.

Work-related project briefs

When assessing work-related projects, such as designing a product, writing a workplace document, solving a problem, conducting a presentation, or developing a proposal for management, you may find it useful to design a project brief or instruction sheet. Projects can be designed for individuals or for groups to complete.

Your project brief or instruction sheet should outline the following:

the purpose of the project, i.e. which elements of competency should be demonstrated through the project
resources the candidate might use
any particular performance expectations
who will observe the performance or assess the product
instructions for the candidate, including the timeframe and any other pertinent information.

Tools for questioning

Asking questions is a widely used teaching, learning and assessment
technique. Tools that you might develop to support this methodology include:

verbal questioning
written questions
interviews
self-assessment questionnaires
questionnaires
oral or written examinations (may be applicable at higher
qualification levels).

Verbal questioning

Verbal questioning is a common assessment technique used in a
number of situations. It does not involve a large investment of time, and
responses to oral questions provide useful evidence of:

a candidate's vocational/technical knowledge
their understanding of workplace procedures, legislation and
safety requirements.

Questioning allows you to probe for clarification, confirmation or
supplementation when needed. For example, responses to "what would you
do if...?" questions are effective ways of determining whether a candidate is
able to deal effectively with contingencies (an important dimension of
competency) and to anticipate and pre-empt problems that may arise out of
the work process.

Oral questioning may also be a reasonable way to accommodate a
candidate's language and literacy skill levels.
Remember that the assessment should not demand higher literacy, language
or communication skills than those required for the job itself.

General guidelines for effective questioning

Keep questions short and focused on one key concept
Ensure that questions are structured
Test the questions to check that they are not ambiguous
Use open-ended questions such as "what if...?" and "why...?",
rather than closed questions
Keep questions clear and straightforward and ask one at a time
Use words that the candidate is able to understand
Look at the candidate when asking questions
Check to ensure that the candidate fully understands the questions

Ask the candidate to clarify or re-phrase their answer if you
do not understand the initial response
Confirm the candidate's response by repeating the answer back in
your own words
Encourage a conversational approach with the candidate when
appropriate, to put him or her at ease
Use questions or statements as prompts for keeping focused on the
purpose of the questions and the kind of evidence being collected
Use language at a suitable level for the candidate
Listen carefully to the answers for opportunities to find unexpected
evidence
Follow up responses with further questions, if useful, to draw out more
evidence or to make links between knowledge areas
Compile a list of acceptable responses to ensure reliability of
assessment

Written questions

Most educators and candidates themselves are very familiar with written
questions in assessment situations, particularly where factual knowledge
rather than its application is being tested. Written questions can be framed
so that candidates are required to:

choose the correct answer, given multiple-choice or true/false options,
or match information with another set of given information
construct the answer themselves, as in short-answer responses or
longer reports or essays.

These two styles are sometimes used in combination to capture the benefits
or minimize the risks associated with each. Clearly, the former is more time-
friendly for the candidate and the person marking the responses, but the
questions can be difficult to construct. Constructed-response questions are
easier to write, but take a longer time to complete and to assess.

For both categories it is useful to develop a response sheet of correct
answers. In the case of longer reports or essays, particularly where the
candidate is asked to analyze or evaluate a situation or information, you will
need to establish the criteria that determine the sufficiency of the
response. This will assist you to provide appropriate instructions for the
candidate.

Self-assessment tools

Many self-assessment tools use written questions to elicit responses from the
candidate and a number of assessment tools can be adapted for this
purpose.

Tools for evidence compiled by the candidate

In some cases candidates, including those seeking recognition of their prior
learning, might compile supplementary evidence such as portfolios,
collections of work samples, products with supporting documentation,
historical evidence, journals/log books or information about life experiences.
With each of these assessment methods, the instructions for candidates, and
the criteria for evaluation are critical.

There have been examples of poor practice. Candidates, especially those
seeking RPL, have been unsupported in their efforts to provide evidence, and
consequently have either given up or collected voluminous evidence that
failed to meet the rules of evidence.

If these methods are used, it is particularly important that the tools that
accompany them provide crystal clear instructions to both assessors and
candidates.

1. Guidelines for using journals and diaries

Students are sometimes encouraged to use reflective techniques, such as
maintaining a diary or a journal. Work journals can also be used to record
events and provide evidence of tasks, activities or achievements
accomplished by the candidate. Many people are unfamiliar with using a
journal other than for their own personal use and need clear guidance on
how it can be used as evidence of what they know, understand or can do.

In preparing guidelines for candidates, you should specify: what element of
competency will be assessed; what should be included in the journal;
whether its primary purpose is as a recording mechanism or as a reflective
tool that encourages self-assessment; what form entries can take (for
example, whether or not pictures and illustrations are acceptable); and how
often entries should be made. The guidelines should also make clear to the
candidate how this information will contribute to formal assessment of the
unit/units of competency.

2. Tools to support the development and assessment of portfolios

A portfolio is a collection of materials prepared by a candidate to
demonstrate their knowledge, skills and understanding. It has often been
used as a tool for candidates seeking RPL. New streamlined approaches to
RPL encourage assessment methods that reduce the previous reliance on
paper-based evidence and provide opportunity for candidates to gather
evidence of their competency in a range of ways that better match the
requirements of the unit/units.

Increasingly, methods that are being used to gather evidence for RPL mirror
assessment methods that are used in a training program. These include self-
assessment, interview processes and/or direct observation either on the job
in the workplace, or in a simulated environment.

If you elect to use portfolios as part of the evidence on which you base your
assessment judgement, your guidelines for candidates need to leave no
doubt as to the intended purpose and expected composition of the portfolio.
Portfolios can be time-consuming to compile and to assess, so if you elect to
use this methodology, you need to exercise care in developing precise
guidelines. Questions of interest are likely to be:

What is a portfolio?
What should it include?
What place does reflection have in the portfolio?
What sections should it contain?
What supporting evidence should be included?
Who will have access to the portfolio (i.e. its public use)?
What part will it play in the formal assessment of my competence?

Tools for reviewing products

Products that are the output of a project, as well as work samples, may
form part of the assessment evidence. Tools that can be developed for this
method might include the product specification and a simple checklist for
assessing the product.

Tools for third party feedback

Assessment involves:

Gathering evidence; and
Making professional judgements about competency on the basis of that
evidence.

Third party evidence is evidence gathered from workplace supervisors, peers
and others to support an assessment decision. An assessor cannot always
observe a candidate over a period of time, and some competencies are
difficult to assess by observation alone. Therefore gathering third party
evidence can be an essential part of the assessment process.


Guidelines for use of third party evidence


Application

Assessors should put in place guidelines for the systematic collection of
quality third party evidence. These may be in the form of information, advice
and checklists for the relevant third parties. This should assist organisations
to comply with the Essential Standards for Registration.

Benefits

It is important to support the collection of quality third party evidence as it
offers assessors a cost effective means of gathering authentic and valid
evidence in often difficult contexts. Third party reports can be used
effectively in the evidence gathering process when:

The evidence is provided by someone who is in a position to
make a valid comment on the candidate's performance, for example, a
line manager or direct supervisor
The evidence is presented in written/official form, includes the
name and contact details of the third party and can be easily verified
It is difficult to gather evidence directly, for example, if a
candidate is located in a remote area or is in a confidential job role
The authenticity and currency of evidence provided by a candidate
needs to be confirmed, for example, as the candidate's own work.

Considerations

There are several things to consider when preparing guidelines for gathering
third party evidence:

A decision needs to be made about the appropriate balance between
third party evidence and evidence drawn from other sources
Guidelines require a validation process prior to dissemination, and
this may involve industry experts
Assessors should implement version control and archive procedures
The qualifications and experience of the third party evidence gatherer
should be considered.

8.4.4 Step Four: Trial, refine and review your tools

To ensure that your assessment resources are consistent with the
requirements of the Training Package and that they maintain their currency,
sufficiency and effectiveness, it is important that your tools are reviewed by
fellow assessors and trialled prior to use.

Inviting feedback from your peers, candidates, and industry will hopefully
confirm that the tools enable effective collection of evidence, and that the
level of difficulty is appropriate to the qualification level. Differences of
opinions provide an opportunity to discuss and resolve any ambiguities or
misunderstandings before the tools are used with candidates.

Trialling your tools before they are used formally with candidates will enable
you to gauge the user-friendliness of the format, the appropriateness of the
literacy and numeracy levels, the clarity of the instructions and the
practicality of the format for recording assessment evidence and
judgements. It will also enable you to evaluate the suitability of the times
allowed for assessment tasks, and the tools' overall cost-effectiveness
(ANTA, 2002).

During the trial, you should also assess the tool's degree of adaptability. This
will be determined by its capacity to be adjusted in accordance with
variations in context and the needs of candidates, while still ensuring valid
and reliable assessment decisions.

Validation of assessment tools can be done in a number of ways, ranging
from sharing with fellow assessors through to industry-wide validation by a
panel of assessors. Working with others often sheds fresh light that leads to
improvements.

9.0 COMPETENCE CERTIFICATION

One of the roles of CDACC is to award certificates to successful trainees of
CBET programs, i.e., competence certification. Certification is the
process of issuing a certificate as evidence of a learner's achievement.
CDACC certificates are only issued after successful completion of the
compulsory and optional modules of a qualification.

Testimonials and records of achievement for partial completion of a
qualification can be issued to learners or their employers on request.

CDACC can delegate certification and the issuing of testimonials and
records of achievement to accredited awarding bodies. To avoid the
illegal issuing of certificates, TVET CDACC certificates will have specific
distinguishing security features that will be made public.

9.1 Certification in CBET System

Certification in the CBET system is based on continuous and summative
assessments. These assessments are based on the modules of the curricula
and on the achievement of the learning outcomes and knowledge, with a
final assessment by external verifiers.

Continuous assessment will be done through logbooks, whereby both the
trainee and trainer sign to indicate attainment of specific competences.

A summary of the assessments conducted will be prepared in the
prescribed format and transmitted to the TVET CDACC by the external
verifiers.

9.2 National Qualifications

National qualifications shall be available from registered training
providers who offer accredited courses. National certificates shall not
be issued for new courses without the approval of the TVET CDACC.

National qualifications shall have internationally recognized
characteristics. They shall:

Have a clear purpose.
Be internally coherent.

Recognize broad transferable and generic skills as well as
specialized industry and professional skills.
Be internationally credible.
Have clear indications of entry requirements wherever applicable.
Specify quality assurance requirements for training delivery and
assessment (unified and impartial).
Provide an indication of the relationship to other qualifications
wherever applicable.
Specify clearly the competencies to be achieved for the award of
the qualification.

9.3 Types of Qualifications

The Council shall issue four types of qualifications:

Record of Achievement
National Certificate
National Diploma
National Higher Diploma
9.3.1 Record of Achievement: Records of Achievement are awarded
to those who demonstrate competence in some but not all of the units of
competence forming a National Certificate or National Diploma. Records
of Achievement are useful as an individual reference for learners and
employees who have yet to attain all the requirements for the award of a
National qualification.

The accredited testing center shall submit successful candidates'
applications for certification in the prescribed format to the Council,
requesting the issuance of a TVET qualification.

All certificates shall bear the logo of the testing center/training provider,
the logo of the Council and the National emblem.

The qualification certificate or record of achievement shall contain the
signature of the Council Secretary and the signature of the director/CEO of
the testing center.

Trainees shall only be eligible to be issued with certificates of
competence upon successful completion of clusters of units/modules for
any given trade.



9.3.2 National certificates/qualifications shall only be issued to
graduates upon successful completion of the whole training program.

9.4 Procedure for Certification

The accredited testing center submits successful candidates' applications for
certification in the prescribed format to the Certification Council.

The Council scrutinizes the candidates' assessment information to ensure
accuracy and completeness.

The certificate is printed or, in cases where a candidate has completed only
portions of the assessment, a record of achievement is printed.

The certificate or record of achievement is signed by the Certification Council
Secretary and the director/CEO of the testing center.

The certificate or record of achievement is issued to the candidate by the
testing center.

9.5 Guidelines for Certification

The approved training provider shall provide accurate details of the trainee,
including: name, National Identity Card (NIC) number, qualification
code and effective date of the Certificate and/or Record of Achievement.

The NIC number is the key data for traceability of a trainee's qualification in
future; however, a passport number is acceptable in place of the NIC number
for foreign students.

National Diplomas, National Certificates and records of achievement are
awarded jointly by the Certification Council and the approved training
provider.

All certificates and records of achievement shall carry provision for the
logo of the training provider or accredited establishment along with the
logo of the Council and National emblem.

The certificates and records of achievement shall contain the signature of
the CDACC Council Secretary and the signature of the director/CEO of the
training provider.

All data of certificate holders are stored in the Council's database.



The Council will issue a national TVET Qualification for all competent
candidates who have completed relevant qualifications.

The director/CEO of the training provider must acknowledge receipt of
certificate(s) by signing and returning the appropriate document to the
Council Secretary.

9.6 Recognition of Prior Learning (RPL)

Recognition of Prior Learning is a process that recognizes a learner's
current competencies, which may have been achieved through any
combination of formal or informal training and education, work experience
or general life experience, regardless of where or how the learning took
place.

Recognition of uncertified learning may be combined with any formal
certification to enable assessment decisions to be made.

The national vocational qualifications system shall recognize prior
learning based on national competency standards and determine the
extent to which an individual has achieved the required competencies for
partial or total completion of a national vocational qualification.

The CBETF provides RPL only up to certificate levels.

A testing center that wishes to conduct RPL assessments shall apply to
the Council with sufficient evidence of staffing, policies and procedures, as
well as the necessary infrastructure and resources.

9.6.1 The Importance of RPL

Many workers and other disadvantaged groups are denied opportunities
for employment, promotion and salary increases in favour of those who
have formal education and training qualifications. This results in a lack
of access and equity, and in inefficiencies in the workplace, because the
existing skills, knowledge and values possessed by these workers cannot
be fully recognized and utilized.

RPL provides for the formal recognition of the skills, knowledge and
values that have been developed and acquired in a range of different
ways. This is done by assessing the respective candidates against
nationally agreed standards that are registered on the TQF. The
candidates receive credits, which open up access to opportunities for
employment, promotion and further education and training.

9.6.2 Procedure for Conducting Recognition of Prior Learning (RPL)
The RPL candidate applies by completing the RPL application and
forwards it to the RPL coordinator along with any supporting
evidence/documents.
The RPL coordinator assigns an assessor and verifier.
The RPL coordinator forwards a copy of the RPL application and other
evidence to the assessor.
The assessor contacts the candidate to commence the assessment process.
The assessor prepares an assessment plan.
The candidate provides evidence of competence.
The assessor assesses the candidate's evidence against unit(s)
requirements and records the outcomes.
The assessor provides feedback to the candidate on the assessment
outcome and obtains signed acknowledgement from the candidate.
The assessor returns the completed RPL assessment record book to the RPL
coordinator.
The RPL coordinator forwards all related documents and records to
the verifier.
The verifier reviews and verifies the assessment decision and
forwards the duly completed forms to the Certification
Council for issuance of the qualification.

9.7 Alignment of Existing Qualifications

One of the major challenges in the operationalization of the CBETF is the
alignment of existing non-TVET qualifications against the CBETF. These
could be external qualifications offered in other countries or non-TVET
qualifications offered in Kenya.

This is important in order to promote a unified TVET system and make the
similarities and differences between the new and old qualifications
transparent to learners, employers and other stakeholders.

The following procedure shall be followed to assess and evaluate
qualifications. The analysis is based on the following criteria:



Duration of the training program undertaken by the holder of the
respective qualifications in relation to the credit system of the
CBETF
Contents of the training programs (syllabus) against the learning
outcomes specified in the relevant modules of the CBETF
qualifications
Method of training delivery
Mode of assessment used

