Assessment Framework
CERTIFICATION COUNCIL
APRIL 2017
1.0 REFORMS IN EDUCATION AND TRAINING
1.1 Introduction
The Ministry of Education is divided into three (3) State Departments, each in charge of one of the three sub-sectors of education and training:
The three Education and Training sub-sectors are guided by four (4) Acts of
Parliament as follows:
The TVET reforms are expected to result in a shift from:
a. Industry
Participate in external assessment of trainees and evaluation of the
CBET systems
b. Training providers
1.6 Conclusion
2.0 ROLE OF CURRICULUM DEVELOPMENT, ASSESSMENT AND
CERTIFICATION COUNCIL (CDACC) AND STAKEHOLDERS IN THE CBET
SYSTEM
2.1 INTRODUCTION
a. Establishment
The TVET CDACC is established under section 44 (1) of the TVET Act, 2013 to
coordinate assessment, examination and certification of TVET.
b. Mandate
c. Vision
d. Mission
2.2 Functions of TVET CDACC as per the TVET ACT NO. 29 of 2013
(g) Promote the publication of books and other materials relevant to its
examinations
(h) Anything incidental or conducive to the performance of any of the
preceding functions
assessments
9. Certification of competence
3.0 OVERVIEW OF THE CBET APPROACH
3.1 Introduction
The TVET system in Kenya faced neglect for a long period of time, resulting in low financing, inequitable participation rates, a mismatch between the skills demanded and the skills supplied, and inefficiency in TVET programmes.
The Basic Education Act, 2013, the Universities Act, 2012 and the TVET Act, 2013 were all formulated to guide the three sub-sectors of education.
These Acts address issues of access, equity, transition rates, relevance, quality and efficiency in the management of the respective sub-sectors.
Ensure that CBET policies and guidelines are disseminated to all key
stakeholders for implementation.
Course accreditation
Monitoring and auditing
3.6 Institutional Structure
3.7 CBET
Link education and training to the skills needed by employers
Those individuals who are made redundant so that they can learn new
competences
Those who have retired but who would like to develop new skills and
competences
Here, the specific objectives are clearly stated in each of the three
learning domains and the achievement is planned and developed in
these domains independently.
In the case of CBET, the competency statements resemble the objective statements in appearance, but a competency statement cuts across the domains.
Monitoring standards/providing verifiers
Within the NQF, there shall be TVET qualifications at the levels of Certificate, Diploma, Higher Diploma and Degree, as well as short courses/modules, which can all be assigned a level, with the learning quantified in terms of notional learning hours.
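As a purely illustrative sketch, learning quantified in notional hours could be converted to a credit value; the 10-hours-per-credit convention below is an assumption made for the example, not a figure taken from the NQF:

```python
# Hypothetical sketch: converting notional learning hours into whole credits
# under an ASSUMED convention of 10 notional hours per credit.
HOURS_PER_CREDIT = 10  # assumption for illustration only

def credits(notional_hours):
    """Whole credits earned for a given number of notional learning hours."""
    return notional_hours // HOURS_PER_CREDIT

print(credits(120))  # → 12
```

The actual quantification rules would come from the NQF documentation for the qualification in question.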
3.14 Sector Skills Advisory Committees
Advise the TVETA Board and TVET CDACC on the establishment of new
vocational trades training and educational programmes or on the
abolition of existing ones.
Assess training needs and determine standards for the area of trades
the committee is covering.
Draw up training specifications and job descriptions for the trades to be taught.
Figure 5 outlines the qualification structure.
3.16 Qualification Structure
3.18 Conclusion
4.0 TRAINING NEEDS ASSESSMENT/ ANALYSIS (TNA)
vii. project current and future employment patterns for skilled workers
Identify, in general terms, what the content of training should be (broad picture).
TNA proposal
Shall be written in the future tense because it is an intent to carry out the survey.
Is a contract document between the person carrying out the TNA and
the organization funding the survey.
Must convince the financier of the survey of the need to commit funds.
a. Background:
Must bring out the mismatch between the competencies needed by the
industry and the skills developed through training using Knowledge
Based Learning.
Indicate that changes in technical training policy require that all TVET
training programs be competency based.
Must indicate how the current TVET training programmes have failed to
meet the competence demands of the industry.
It clearly outlines the approach that will be used to tackle the stated
problem.
It attempts to answer the questions: what, where, when, why and how.
c. Objectives
The TNA proposal shall have both the general and the specific objectives of the survey.
It shall also have the survey questions which are drawn from the
specific objectives.
d. Justification
The initiator must justify why there is need to carry out the survey and
state the consequences of not carrying out the survey.
e. Methodology
f. Appendices
etc
A TNA Report is evidence that there exists a gap that needs to be addressed
through training. The TNA report shall among other things:
A TNA report shall be written in past tense and shall have the following:
Background
Problem statement
Objectives
Survey questions
Methodology used
Findings
Conclusions
Recommendations
4.3.2 Appendices
References
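The required report sections above lend themselves to a simple completeness check when reviewing a draft report. The sketch below is a hypothetical illustration of such a checklist, not part of the framework:

```python
# Hypothetical sketch: checking a draft TNA report for the required sections.
REQUIRED_SECTIONS = [
    "Background", "Problem statement", "Objectives", "Survey questions",
    "Methodology used", "Findings", "Conclusions", "Recommendations",
    "Appendices", "References",
]

def missing_sections(draft_sections):
    """Return the required sections absent from a draft report."""
    present = {s.strip().lower() for s in draft_sections}
    return [s for s in REQUIRED_SECTIONS if s.lower() not in present]

draft = ["Background", "Objectives", "Findings", "Recommendations"]
print(missing_sections(draft))
```

A reviewer would, of course, also judge the quality of each section, not merely its presence.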
4.3.3 Conclusion
The curriculum initiator shall first identify the respondents and the methodology for the TNA survey, design relevant data collection tools or instruments, and then carry out the Training Needs Assessment.
Questionnaires
Interview guides
Observation schedules
Document analysis guides
4.4.1 Description
Male ( ) Female ( )
a. Questionnaire
Description
A questionnaire can be:
Self-administered questionnaire: the respondent fills in the questionnaire in the absence of the data collector.
Or
Interviewer-administered questionnaire: the data collector fills in the questionnaire as the respondent responds.
A self-administered questionnaire may be
Description
This is a tool for primary data collection used by the interviewer during
the interview as a guide.
Is rigid, in that the researcher reads the items as they are, without changing or expounding on them.
iii) In-depth Interview Guide
c. Observation guides
Description
Description
Assists the researcher to collect secondary data during document review, e.g. review of Labour Market Information documents.
It is a vital tool in historical research, where past documents are analyzed to unveil historical facts, understand the circumstances that led to the development of certain theories, etc.
o Avoid double questions e.g. 'Are you an effective and efficient worker?'
o Not use leading or biased questions e.g. 'Don't you agree that A-level education should be abolished?'
o Not use presuming questions e.g. 'How do you like teaching in a remote part of the country?'
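These wording rules can even be screened for mechanically when reviewing a draft instrument. The sketch below is a simplistic illustration; the string patterns are assumptions, not an official checklist:

```python
# Hypothetical sketch: flagging questionnaire items that may break the
# wording rules above (double, leading/biased, or presuming questions).
def flag_item(question):
    """Return a list of possible wording problems in a questionnaire item."""
    problems = []
    q = question.lower()
    if " and " in q and q.endswith("?"):
        problems.append("possible double question")
    if q.startswith(("don't you agree", "do you not agree")):
        problems.append("leading/biased question")
    if q.startswith("how do you like"):
        problems.append("presuming question")
    return problems

print(flag_item("Are you an effective and efficient worker?"))
```

Such pattern matching only catches obvious cases; human review of each item remains essential.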
1. Demographic Items
Age: 20 to 25 ( ) 26 to 30 ( )
Occupation: _____________________
2. Information Items
These questions seek to find out the respondent's knowledge of an area of concern, e.g.
3. Self-Perception Items
These are items that try to determine the respondent's evaluation of his/her behavior in relation to others, as well as his/her evaluation of others, e.g.
_________________________________________
4. Attitudinal Items
Also known as Likert scale items, in which the respondent selects the most appropriate response from a weighted scale, e.g.
Please rate the extent to which you Strongly Agree (SA), Agree (A), Disagree (D) or Strongly Disagree (SD) with the statements.
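Responses on such a weighted scale are commonly scored numerically. A minimal sketch follows; the 4-point weights are an assumption for illustration, not values prescribed by this framework:

```python
# Hypothetical sketch: scoring Likert-scale responses on an assumed
# 4-point weighted scale (SA=4, A=3, D=2, SD=1).
WEIGHTS = {"SA": 4, "A": 3, "D": 2, "SD": 1}  # assumed weighting

def mean_score(responses):
    """Average weighted score for one statement's responses."""
    return sum(WEIGHTS[r] for r in responses) / len(responses)

responses = ["SA", "A", "A", "D", "SD"]
print(round(mean_score(responses), 2))  # → 2.6
```

The mean score per statement is then compared across statements or respondent groups during analysis.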
Title of the Survey e.g. National Training Needs Assessment for the
development of a CBET curriculum for
Purpose
Instructions
4.7.1 Organization
5.0 JOB ANALYSIS
However, analysis of a particular job does not guarantee that the managers or the organization will get the desired output. Collecting and recording information for a specific job involves several complications. If the job information is not accurate and checked from time to time, an employee will not be able to perform his duties well. Unless he is aware of what he is supposed to do and what is expected of him, the time and energy spent on a particular job analysis is likely to be a waste of human resources. Therefore, proper care should be taken while conducting job analysis.
Recruitment and Selection: Job analysis helps in determining what kind of person is required to perform a particular job. It points out the educational qualifications, level of experience and the technical, physical, emotional and personal skills required to carry out the job in the desired fashion. The objective is to fit the right person in the right place.
Performance Analysis: Job analysis is done to check whether the goals and objectives of a particular job are met. It helps in deciding the performance standards, evaluation criteria and the individual's output. On this basis, the overall performance of an employee is measured and he or she is appraised accordingly.
Job Designing and Redesigning: The main purpose of job analysis is to streamline human effort and get the best possible output. It helps in designing, redesigning, enriching, evaluating and also cutting back or adding extra responsibilities in a particular job. This is done to enhance employee satisfaction while increasing human output.
This is due to the fact that every person has his own way of observing things. Different people think differently and interpret the findings in different ways. Therefore, the process may involve personal bias or likes and dislikes and may not produce genuine results. This error can be avoided by proper training of the job analyst or whoever will be conducting the job analysis process.
These are some of the most common methods of job analysis. However,
there are several other specialized methods including task inventory, job
element method, competency profiling, technical conference, threshold traits
analysis system and a combination of these methods. While choosing a
method, HR managers need to consider time, cost and human efforts
included in conducting the process.
a. Job Content
b. Job Context
c. Job Requirements
Duties of an employee
b. Job Context: Job context refers to the situation or condition under which
an employee performs a particular job. The information collection will
include:
Working Conditions
Risks involved
Whom to report
Hazards
Judgment
As with job content, the data collected under this category are also subject to change according to the type of job in a specific division or department.
For different jobs, the parameters would be different. They depend upon the
type of job, designation, compensation grade and responsibilities and risks
involved in a job.
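The three information categories described above (job content, job context and job requirements) could be organized in a simple record structure. The sketch below is purely illustrative; the field names are assumptions, not part of any prescribed format:

```python
# Hypothetical sketch: organizing collected job-analysis data into the
# three categories described above (content, context, requirements).
from dataclasses import dataclass, field

@dataclass
class JobAnalysis:
    job_content: list = field(default_factory=list)      # duties of the employee
    job_context: list = field(default_factory=list)      # conditions, risks, reporting, hazards
    job_requirements: list = field(default_factory=list) # qualifications, skills, judgment

record = JobAnalysis()
record.job_context += ["working conditions", "risks involved", "whom to report to"]
print(len(record.job_context))  # → 3
```

Grouping the data this way keeps the category-specific parameters separate, since, as noted, they vary with the type of job.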
Let us discuss a few of the job analysis methods commonly used by organizations to investigate the demands of a specific job.
5.7 Job Analysis Tools
There are various tools and techniques, such as the O*Net model, PAQ model, FJA model, F-JAS model and competency model, that help HR managers to develop genuine job description and job specification data. Though not very new, these specialized tools and techniques are used by only a few high-profile organizations. They are not in common use, but once understood, these systematic approaches prove extremely useful for measuring the worth of any job in an organization.
FJA Model: FJA stands for Functional Job Analysis, and it helps in collecting and recording job-related data in greater depth. It is used to develop task-related statements. Developed by Sidney Fine and his colleagues, the technique helps in determining the complexity of the duties and responsibilities involved in a specific job. This work-oriented technique works on the basis of the relatedness of job data, where the complexity of work is determined on a scale of scores given to a particular job; the lower scores represent greater difficulty.
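As a toy illustration of that scoring convention (the tasks and scores below are invented for the example; in this scale a lower score means greater difficulty):

```python
# Hypothetical sketch: ranking tasks by an FJA-style complexity score,
# where LOWER scores represent GREATER difficulty.
task_scores = {
    "synthesizing data": 0,   # invented scores for illustration
    "analyzing data": 2,
    "copying data": 5,
}

# Sort tasks from most to least difficult (ascending score).
ranked = sorted(task_scores, key=task_scores.get)
print(ranked[0])  # → synthesizing data
```

The real FJA scales and scores would come from the published FJA materials, not from this sketch.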
5.8 Problems with Job Analysis
The process involves a variety of methods, tools, plans and a great deal of human effort, and where people are involved, nothing can be 100 percent accurate. The outcomes may nonetheless be appropriate, considering various factors including organizational requirements, time, effort and financial resources.
Since the entire set of job analysis processes, methods and tools is designed by humans, they tend to have practical issues associated with them. The human brain suffers from certain limitations; therefore, everything created, designed or developed by humans also has constraints.
The process of job analysis has many practical problems associated with it. Though the process can be effective, appropriate, practical, efficient and focused, it can also be costly, time-consuming and disruptive for employees. This is because a job analyst encounters some typical problems while carrying out the process. Let us discuss them and understand how the process of job analysis can be made more effective by treating them carefully.
sense and start looking out for other available options. They may have
a notion that this is being carried out to fire them or take any action
against them. In order to avoid such circumstances, top management
must effectively communicate the right message to their incumbents.
Lack of Co-operation from Employees: When it comes to collecting authentic and accurate job data, it is almost impossible to get real and genuine data without the support of employees. If they are not ready to co-operate, conducting the job analysis process is a sheer waste of time, money and human effort. The need is to take the workers into confidence, communicating that it is being done to solve their problems only.
However, this is not the end. There may be many other problems involved in
a job analysis process such as insufficient time and resources, distortion from
incumbent, lack of proper communication, improper questionnaires and
other forms, absence of verification and review of job analysis process and
lack of reward or recognition for providing genuine and quality information.
Though job analysis plays a vital role in all other human-related activities, every process that involves human intervention also suffers from some limitations, and the process of job analysis has its own constraints. So, let us discuss the advantages and disadvantages of the job analysis process at length.
Advantages of Job Analysis
Helps in Analyzing Training & Development Needs: The process of job analysis gives answers to the following questions:
Source of Data is Extremely Small: Because the sample size is small, the source from which data is collected is extremely limited. The information collected from a few individuals therefore needs to be standardized.
6.0 OCCUPATIONAL STANDARDS
NOS are produced as a suite of units for each occupational area. Each unit describes an area of work, with the activities normally separated out into elements with associated performance statements. These statements are detailed descriptions of the activities which represent effective performance of the tasks within the unit; a range of situations or circumstances; and the underpinning knowledge and understanding needed to effectively carry out the tasks and responsibilities within the particular job role or function.
6.4 What is the Structure of an Occupational Standard?
6. Required skills: what skills are needed to perform this work activity
7. Evidence Guide
6.6 What are the general Principles of Occupational Standards?
In order for standards to be approved, their format and content must meet
the criteria below:
Describe knowledge, skills or understanding instead of outcomes; describe outcomes with verbs which simply mean 'do'; use ambiguous or secondary verbs; or place an evaluative term into the statement of competence.
All performance criteria, required knowledge and skills and range must be
assessed.
Range Statements should:
Describe the tools, equipment, materials, methods and processes which are
significant to the work activity; describe significant variations which would
require different skills, methods or processes as required by industry.
Reflect current and future requirements for flexibility and breadth.
Range Statements should not:
List variations which do not really require different skills or levels of skill; or offer options or alternatives (all of the range must be assessed).
7.0 COMPETENCY BASED ASSESSMENT
Being competent means that the individual has suitable or sufficient skills, knowledge, experience and attitude according to the specifications of industry.
Competence means having the skill and knowledge to correctly carry out a task, a skill, or a function.
Assessment involves:
Assessment may be carried out for various reasons or purposes, for example:
o For promotion or classification.
Any organization may undertake assessment for its own purposes and develop its own plan as to how this is done. However, in Technical and Vocational Education and Training (TVET), only accredited competency-based assessment centers can assess competencies leading to the issue of nationally recognized qualifications.
Assessors should:
STANDARD NO. 4: The center provides essential support services to
candidates and demonstrates commitment to address their welfare.
What this means is that not all candidates will need to do the same thing
before we make our judgement about their performance. This opens up the
opportunity for different assessment pathways. There are three main
pathways, with numerous variations based on the needs of individual
candidates and clients:
a. Formative Assessment
b. Summative Assessment
c. Holistic Assessment
How each of these types of assessment fits within our work is shown in the
following diagram:
7.8 Assessment Context
assessment, from the environment in which the assessment is to take place, to the purpose of the assessment and the candidates.
For example:
Accepted assessment systems and processes may already be in existence in
your industry or in an enterprise where you may be required to assess.
Institutional contexts are those that involve training and/or assessment at some sort of training venue.
Some of the types of assessment that are suited to the institutional context are shown below.
Table: Method, Purpose and Tool for each assessment type; example tools include a product checklist, a mentor report, a self-evaluation and a team-leader report.
Standards/criterion-based
Evidence-based
Work-focused/participatory
Criterion-referenced
a. Standards/criterion-based
b. Evidence-based
c. Work-focused/Participatory
Know about assessment
Have a vocational skill to assess in (or work with someone who has
expertise in the area being assessed)
Work closely with the candidate.
This is where the candidate needs to learn the skills and knowledge first, and
the assessment is conducted:
Assessment only
This is where skills and knowledge have already been gained, and the
candidate is ready to be assessed against the relevant criteria/benchmarks
without needing to go through a training program. This assessment only
pathway is called many different things:
Recognition of Prior Learning (RPL)
Recognition of Current Competency (RCC)
Skills recognition
Recognition.
The evidence will prove that the individual has the required skills and
knowledge as specified in the relevant unit of competency (Validity)
Other assessors would make the same decision (Reliability)
Assessments can be either on- or-off-the job, and at a mutually
convenient time and situation (Flexibility)
The assessor objectively considers all evidence, is open and
transparent about all assessment decisions, and takes into account
relevant characteristics and needs of the candidate (Fairness).
When and where the assessment will occur: this includes details of any due dates for submission of evidence, or dates and times of when the assessment will occur and the proposed location of the assessment
What resources or special arrangements are required: this outlines what is needed to carry out the assessment, given the special needs of candidates, organizational requirements, or other legislative or OHS considerations
Context for assessment: this outlines the details of the environment in which the assessment will take place and any changes which need to be made as a result. For example, will it be on-the-job, off-the-job or a combination of both? Or will the assessment be contextualized to the work setting?
Instructions for the candidate: this outlines information to be given to the candidate, related to the assessment exercise at hand.
7.12 Rules of evidence
The assessment methods and tools you choose in the planning phase must
be able to collect evidence that meets each of these rules of evidence.
7.12.1 Validity
Assessment methods chosen must ensure that the evidence collected covers
all the requirements in the benchmark or criteria. If the benchmark is a unit
of competency, then the evidence must cover:
All elements
All performance criteria
The dimensions of competency
Employability Skills
Consideration of the Qualifications level
All the musts in the range statement
All the critical evidence listed in the evidence guide
All essential skills and knowledge listed in the evidence guide.
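The coverage requirement above amounts to a set comparison between the unit's requirements and the evidence the chosen methods will produce. A minimal sketch, under the assumption that both are listed as simple strings (the example requirement names are invented):

```python
# Hypothetical sketch: checking that planned assessment evidence covers
# every requirement of the unit of competency (the validity rule).
def uncovered(requirements, evidence_covers):
    """Return unit requirements not covered by any planned evidence."""
    return sorted(set(requirements) - set(evidence_covers))

unit_requirements = ["element 1", "element 2", "critical evidence A"]
planned = ["element 1", "critical evidence A"]
print(uncovered(unit_requirements, planned))  # → ['element 2']
```

Any uncovered requirement signals that an additional assessment method or tool is needed before the plan is valid.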
7.12.2 Sufficiency
When choosing assessment methods, ensure that the tools can collect enough evidence to make a decision about the candidate's competency. Usually, this means collecting evidence that shows competency over a period of time and in different situations. It is also important to make sure the methods and tools will assess all aspects of competency. A good way to make sure there is sufficient evidence is to use a combination of different assessment methods.
7.12.3 Currency
7.12.4 Authenticity
When planning an assessment, you must be able to ensure that the evidence to be gathered is the candidate's own work. Evidence-gathering activities such as direct observations or verbal questions produce authentic evidence easily, because the assessor sees the candidate's skills or hears the answers to the questions. Other assessment methods, such as projects or written work, may need further authentication. This can be achieved by using more than one method, for example a project plus a follow-up interview to clarify the project stages, or supplementary information in the form of a third party report from managers or supervisors.
More specifically, this unit will help you develop skills and knowledge to
enable you to:
This Learning Topic examines the reasons for carrying out validation in
assessment. It looks at the preparation you need to undertake so that you
can participate successfully in validation processes.
If you are participating in assessment validation, then your role may include:
Obtaining, reading and interpreting materials: these might be provided directly to you prior to the assessment validation, or you might have to locate them yourself. You will probably be familiar with some of these materials.
Submitting your materials: you may be required to submit examples of assessment materials you use with candidates, such as assessment tools including instructions, assessment checklists, questions and case studies.
Participating in assessment validation activities: depending on the type of approach used, you may be part of an assessment panel, team assessment or any other approach used. You will need to be prepared to actively contribute to validation sessions using appropriate communication skills.
Discussing validation findings and suggesting recommendations to improve the quality of assessment.
7.13.4 Assessment validation
For any organisation using assessment, validation will ensure that the
assessment processes, methods, tools and decisions are valid and reliable.
one another. It helps to raise the confidence of assessors and clients in the
assessment processes used, and leads to greater accountability.
Planned
Targeted at a specific audience
Documented
Focused on identified areas such as assessment methods and tools.
The requirements of the national assessment guidelines of the relevant
Training Package
The performance criteria or evidence requirements of learning
strategies, programs or assessment plans
Requirements of OHS legislation, codes of practice, standards and
guidelines
Assessment requirements of the National Reporting System
Organizational requirements or product specifications.
a. Before assessment
b. During assessment
c. After assessment
Have you ever been involved in discussions about what constitutes valid and
fair assessment? Validation approaches include:
Assessment panels
Validation and/or moderation meetings
Collectively developing and/or reviewing banks of assessment tools
and exemplars
Benchmarking
Field testing, trialling and piloting assessment tools
Peer review
Team assessment
Internal audit process
Client feedback mechanisms
Mentoring by more experienced assessors and facilitators
Use of an independent assessment validator to review.
a. Candidates
b. Assessors
Assessors' responsibilities are to:
c. Internal Verifiers
hold a recognised Internal Verifier's qualification (e.g. L&D11, V1, D34 or the OPITO Approved Internal Verifier Certificate);
fully understand the content of, and the assessment requirements for,
the Standard(s) for which they have responsibility for verifying;
sample evidence across all Assessors and Candidates;
provide feedback, advice and support to Assessors;
comply with the internal verification processes and quality procedures
for the Standard(s);
maintain records of internal verification activities for the Standard(s);
conduct and/or participate in standardization activities to ensure a
consistent approach to assessment;
participate in, and support, internal quality systems and ensure that
any corrective actions and recommendations required following
internal audits are carried out in a timely manner.
The Internal Verifier may also carry out the following additional activities, in accordance with their organisation's and, where relevant, the Awarding Body's/Organisation's quality systems: implement an appeals procedure to settle any disputes between Candidates and Assessors; facilitate, or contribute to, the induction, training and development of Assessors; and participate in, and support, external quality audits and ensure that any corrective actions and recommendations required following the audits are carried out in a timely manner.
Expert Witnesses should meet the following minimum requirements: be discipline experts with typically a minimum of 2 years' relevant experience in the discipline area; and participate in a briefing session to ensure that they are familiar with the Standard(s) being assessed and that they understand their role.
agree a time with the Candidate for the observation to take place and
advise the Assessor, where possible;
The Assessor would review all the evidence provided by the Candidate,
including the observation by the Expert Witness, and make a judgement on
the competence of the Candidate. It would be the responsibility of the
Assessor to make sure that any testimony from an Expert Witness is reliable
and technically valid.
8.0 DESIGNING ASSESSMENT TOOLS FOR QUALITY OUTCOMES IN
TVET
Assessment tools are materials that enable you to collect evidence using
your chosen assessment method.
Assessment tools are the instruments and procedures used to gather and
interpret evidence of competence:
When developing assessment tools, you need to ensure that the principles of
assessment are met. This is not only good practice but also a requirement of
the National Qualification Framework. The assessment principles require that
assessment is valid, reliable, flexible and fair.
assessment criteria, methods and tools used, and the context and
timing of the assessment.
Fair assessment does not disadvantage particular candidates or
groups of candidates. This may mean that assessment methods are
adjusted for particular candidates (such as people with disabilities
or cultural differences) to ensure that the method does not
disadvantage them because of their situation. An assessment
should not place unnecessary demands on candidates that may
prevent a candidate from demonstrating competence (for example,
an assessment should not demand a higher level of English
language or literacy than that which is required to perform to the
workplace standard outlined in the competencies being assessed).
As with the design of all products, the quality of an assessment tool will
depend heavily on the time and effort that goes into the research and
development phases of its construction, and the ongoing testing and refining
of prototypes.
Step One: Familiarize yourself with the mandatory requirements of the assessment task/s.
Step Three: Get down to business and devise the assessment tool/s.
Step Four: Trial and refine your tools, to help you maximize confidence that the tool/s can be used flexibly and assist you to make valid, reliable and fair judgments.
Once your competency profile is developed and the sum total of the activities undertaken by a person doing that job has been examined, you will be in a better position to identify opportunities to cluster units of competency to reflect actual workplace practices.
benchmarks
evidence requirements
assessment methods and tools; and the
evidence produced.
Benchmarks determine the evidence requirements, which in turn determine the assessment methods and tools. For example:
Benchmark: serve customers in a restaurant.
Evidence requirements: demonstration of the learner's knowledge of handling clients/customers and personal hygiene; demonstration of customer service, public relations and personal grooming.
Methods: real work observation; third party assessment.
Tools: observation checklist; third party report; instructions for candidates and assessors.
Evidence is the information that, when considered against a unit of
competency, enables you to confidently judge whether or not someone is
competent. Unlike other forms of assessment, competency based
assessment does not involve comparisons between candidates. Candidates
are assessed against standards that are clearly defined and articulated.
The candidates that your assessment methods and tools need to cater for
might be quite broadly based or may come from a clearly defined target
group, such as an enterprise, an industry sector, occupants with a particular
job profile, or a group defined by funding body requirements. Wherever
possible, it is important that you identify your candidate group in order to
design appropriate tools.
How will you gather the evidence?
Who will collect the evidence?
Where will you gather the evidence?
When will you gather the evidence?
Evidence compiled by the candidate: portfolios; collections of work samples; products with supporting documentation; historical evidence; journal/log books; information about life experience.
Review of products: products as a result of a project; work samples/products.
Third party feedback: testimonials/reports from employers/supervisors; evidence of training; authenticated prior achievements; interviews with employers, supervisors or peers.
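These method-to-evidence pairings can be held in a simple lookup structure when planning evidence gathering. The sketch below is a hypothetical illustration built directly from the pairings above:

```python
# Hypothetical sketch: mapping evidence-gathering methods to the example
# evidence types listed above.
EVIDENCE_TYPES = {
    "evidence compiled by the candidate": [
        "portfolios", "collections of work samples",
        "products with supporting documentation", "historical evidence",
        "journal/log books", "information about life experience",
    ],
    "review of products": [
        "products as a result of a project", "work samples/products",
    ],
    "third party feedback": [
        "testimonials/reports from employers/supervisors",
        "evidence of training", "authenticated prior achievements",
        "interviews with employers, supervisors or peers",
    ],
}

print(len(EVIDENCE_TYPES["third party feedback"]))  # → 4
```

An assessor planning a tool could consult such a map to pick a combination of methods that together satisfy the rules of evidence.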
Your choices will also be influenced by your determination of the literacy and
numeracy skills and language proficiency of your candidates, and the skill
levels that are required in the qualification. If you are in any doubt, you may
need to draw on the expertise of specialist Language Literacy and Numeracy
(LLN) professionals to make this judgement.
In selecting your assessment methods, you will also be making inherent
judgements about who will collect the evidence. Training Package
Assessment Guidelines may provide assistance with who can collect
evidence. It is important, whether it is the candidate, the assessor or a third-
party evidence gatherer, that the instrument and instructions of your
assessment tools clarify what is expected, and provide a clear structure for
them to follow.
Where you gather the evidence will be influenced by the requirements of the
Training Package or course. Most will recommend the workplace as the
preferred setting, where you will need to make sure that safety issues are
considered, and disruptions to the workplace are minimized.
Your obligations
Regardless of the type of evidence that you collect and examine, you are
required to meet the requirements of the AQTF. Before you move to
designing your assessment tools, take the time to consider whether the
assessment methods you have selected enable you to meet the principles of
assessment.
Importantly, they should provide clear guidance and support for candidates
so that there is no ambiguity about what is required of candidates or the
basis on which assessors will make decisions. They can also, if well designed,
be used for recording and reporting purposes.
candidate's name
assessor's name(s)
date of assessment
unit/cluster title
assessment context
procedure for assessment
list of knowledge/skills to be assessed
competence achieved/outcomes of the assessment
candidate feedback
candidate's signature/date
assessor's signature/date
instructions to candidate and assessor or other evidence gatherer
resource requirements of the assessment.
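Taken together, the items above describe a simple record structure. The
sketch below models them as a Python dataclass; all names and sample values
are illustrative assumptions, not part of the CDACC framework:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class AssessmentRecord:
    """Illustrative record of one candidate's assessment for a unit/cluster."""
    candidate_name: str
    assessor_names: List[str]
    date_of_assessment: str              # e.g. "2017-04-21"
    unit_or_cluster_title: str
    assessment_context: str              # workplace, simulated setting, etc.
    procedure: str                       # how the assessment is conducted
    knowledge_and_skills: List[str]      # what is being assessed
    outcome: Optional[str] = None        # e.g. "competent" / "not yet competent"
    candidate_feedback: str = ""
    candidate_signed: bool = False       # candidate signature/date captured
    assessor_signed: bool = False        # assessor signature/date captured

# Hypothetical usage: every detail below is invented for illustration.
record = AssessmentRecord(
    candidate_name="A. Candidate",
    assessor_names=["B. Assessor"],
    date_of_assessment="2017-04-21",
    unit_or_cluster_title="Customer service",
    assessment_context="Workplace observation",
    procedure="Direct observation with oral questioning",
    knowledge_and_skills=["personal grooming", "customer handling"],
)
record.outcome = "competent"
record.candidate_signed = record.assessor_signed = True
print(record.outcome)  # competent
```

A structure like this doubles as a recording and reporting template, since
every field the tool must capture is named explicitly.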
The tools that you design must comply with the rules of evidence, i.e. the
tool must facilitate the gathering of evidence that is valid, sufficient,
authentic and current.
Your assessment tool gives shape and form to your chosen assessment
method. It must, therefore, be fit for purpose, which means you need to ask
yourself which tool is needed to most effectively and efficiently support your
chosen assessment method. Particular attention should be paid to the
language, literacy and numeracy skill level of the candidates and the
requirements of the units of competency when designing your tools.
Instructions for the candidate and the assessor are an integral part of all
assessment tools. Your instructions should respond to questions regarding
the what, when, where, how, and why of assessment processes. You might
include suggestions on reasonable adjustment to accommodate diversity
and/or advice on your recording requirements for the assessor/observer.
observation checklists
questions to accompany checklists
instructions to candidates and observers/assessors.
a. Observation checklists
vocational skills
employability skills
application of workplace procedures, including OHS procedures.
You should also include clear instructions for the candidate and for the
assessor either on the checklist or in a separate document:
The scenario
The scenario can be a simple card outlining the scenario to the candidate,
any other participants, and the assessor.
You will need to provide scripts for any participants who help to create the
situation.
The following Guidelines for workplace simulation may help the assessor
decide if this is an appropriate assessment method. This is followed by an
example, which includes the instructions/procedures for an assessor and
the assessment instrument, for an activity that simulates a hazardous
situation.
Training Package requirements and industry views on the use of
simulation
the benefits and limitations of using a simulation
learner characteristics and needs
available workplace opportunities
the cost of establishing and using simulated environments
how the simulated assessment can be combined with other forms of
evidence gathering such as logbooks, portfolios or work placements
Does simulation meet the principles of assessment in this unit or
cluster?
Asking questions is a widely used teaching, learning and assessment
technique. Tools that you might develop to support this methodology include:
verbal questioning
written questions
interviews
self-assessment questionnaires
questionnaires
oral or written examinations (may be applicable at higher
qualification levels).
Verbal questioning
Ask the candidate to clarify or re-phrase their answer if the assessor
does not understand the initial response
Confirm the candidate's response by repeating the answer back in
his/her own words
Encourage a conversational approach with the candidate when
appropriate, to put him or her at ease
Use questions or statements as prompts for keeping focused on the
purpose of the questions and the kind of evidence being collected
Use language at a suitable level for the candidate
Listen carefully to the answers for opportunities to find unexpected
evidence
Follow up responses with further questions, if useful, to draw out more
evidence or to make links between knowledge areas
Compile a list of acceptable responses to ensure reliability of
assessment
Written questions
Most educators and candidates themselves are very familiar with written
questions in assessment situations, particularly where factual knowledge
rather than its application is being tested. Written questions can be framed
so that candidates are required to either select a response from given
alternatives (for example, multiple-choice or true/false items) or construct
their own response.
These two styles are sometimes used in combination to capture the benefits
or minimize the risks associated with each. Clearly, the former is more time-
friendly for the candidate and the person marking the responses, but such
questions can be difficult to construct. Questions that require a constructed
response are easier to write, but take longer to complete and to assess.
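The marking trade-off between the two styles can be sketched in a few lines:
selected-response items can be marked automatically against an answer key,
while constructed responses need assessor judgement, approximated very
crudely here by keyword matching. All function names and sample items are
invented for illustration:

```python
def mark_selected(answers, key):
    """Automatic marking: compare the chosen options against the answer key."""
    return sum(1 for chosen, correct in zip(answers, key) if chosen == correct)

def mark_constructed(response, acceptable_points):
    """Marking aid for free-text answers: count acceptable points the
    candidate mentions (a crude stand-in for assessor judgement)."""
    text = response.lower()
    return sum(1 for point in acceptable_points if point in text)

# Selected response: candidate chose b, c, a against key b, d, a -> 2 correct.
score = mark_selected(["b", "c", "a"], ["b", "d", "a"])

# Constructed response: two acceptable points are present in the answer.
points = mark_constructed(
    "Wash hands, then wear clean protective clothing.",
    ["wash hands", "protective clothing"],
)
print(score, points)  # 2 2
```

The contrast shows why selected-response items are cheap to mark but costly
to write well, while constructed responses reverse that balance.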
Self-assessment tools
Many self-assessment tools use written questions to elicit responses from the
candidate and a number of assessment tools can be adapted for this
purpose.
If these methods are used, it is particularly important that the tools that
accompany them provide crystal clear instructions to both assessors and
candidates.
A portfolio is a collection of materials prepared by a candidate to
demonstrate their knowledge, skills and understanding. It has often been
used as a tool for candidates seeking RPL. New streamlined approaches to
RPL encourage assessment methods that reduce the previous reliance on
paper-based evidence and provide opportunity for candidates to gather
evidence of their competency in a range of ways that better match the
requirements of the unit/units.
Increasingly, methods that are being used to gather evidence for RPL mirror
assessment methods that are used in a training program. These include self-
assessment, interview processes and/or direct observation either on the job
in the workplace, or in a simulated environment.
If you elect to use portfolios as part of the evidence on which you base your
assessment judgement, your guidelines for candidates need to leave no
doubt as to the intended purpose and expected composition of the portfolio.
Portfolios can be time-consuming to compile and to assess, so if you elect to
use this methodology, you need to exercise care in developing precise
guidelines. Questions of interest are likely to be:
What is a portfolio?
What should it include?
What place does reflection have in the portfolio?
What sections should it contain?
What supporting evidence should be included?
Who will have access to the portfolio (i.e. its public use)?
What part will it play in the formal assessment of my competence?
Tools for reviewing products
Assessment involves:
Guidelines for use of third party evidence
Assessors should put in place guidelines for the systematic collection of
quality third party evidence. These may be in the form of information, advice
and checklists for the relevant third parties.
Application
Benefits
It is important to support the collection of quality third party evidence as it
offers assessors a cost-effective means of gathering authentic and valid
evidence in often difficult contexts. Third party reports can be used
effectively in the evidence gathering process when:
Considerations
There are several things to consider when preparing guidelines for gathering
third party evidence:
Inviting feedback from your peers, candidates, and industry will hopefully
confirm that the tools enable effective collection of evidence, and that the
level of difficulty is appropriate to the qualification level. Differences of
opinion provide an opportunity to discuss and resolve any ambiguities or
misunderstandings before the tools are used with candidates.
Trialling your tools before they are used formally with candidates will enable
you to gauge the user-friendliness of the format, the appropriateness of the
literacy and numeracy levels, the clarity of the instructions and the
practicality of the format for recording assessment evidence and
judgements. It will also enable you to evaluate the suitability of the times
allowed for assessment tasks, and the tool's overall cost-effectiveness
(ANTA, 2002).
During the trial, you should also assess the tool's degree of adaptability. This
will be determined by its capacity to be adjusted in accordance with
variations in context and the needs of candidates, while still ensuring valid
and reliable assessment decisions.
9.0 COMPETENCE CERTIFICATION
Recognize broad transferable and generic skills as well as
specialized industry and professional skills.
Be internationally credible.
Have clear indications of entry requirements wherever applicable.
Specify quality assurance requirements for training delivery and
assessment (unified and impartial).
Provide an indication of the relationship to other qualifications
wherever applicable.
Specify clearly the competencies to be achieved for the award of
the qualification.
Record of Achievement*
National Certificate
National Diploma
National Higher Diploma
9.3.1 Record of Achievement* - Records of Achievement are awarded
to those who demonstrate competence in some but not all of the units of
competence forming a National Certificate or National Diploma. Records
of Achievement are useful as an individual reference for learners and
employees who have yet to attain all the requirements for the award of a
National qualification.
All certificates and records of achievement shall carry provision for the
logo of the testing centre/training provider or accredited establishment,
along with the logo of the Council and the National emblem.
All data on certificate holders are stored in the Council's database.
RPL provides for the formal recognition of the skills, knowledge and
values that have been developed and acquired in a range of different
ways. This is done by assessing the respective candidates against
nationally agreed standards that are registered on the TQF.
This is important in order to promote a unified TVET system and make the
similarities and differences between the new and old qualifications
transparent to learners, employers and other stakeholders.