

Assessing knowledge

Assessing underpinning knowledge is not always as easy as assessing practical skills. In the case of hands-on skills, you can observe the candidate performing the task and tick off each component on your checklist as you watch them do it.

But knowledge is invisible until it's either put into practice or described in words, and both of these ways of expressing it have their drawbacks.

For example, if you wanted to test a candidate's knowledge of what they would do in the event of a fire on-site, it would be impractical to set the building alight and watch their performance.

On the other hand, if you gave them a written test, they might have trouble putting their answers into words and writing them down.

There are ways, however, of collecting evidence of underpinning knowledge that will give you enough information to decide on a candidate's competence. These are described below.

Verbal questioning

For people with poor literacy skills, a verbal question and answer session is the fairest way of testing their knowledge. Depending on the competency you are assessing, you may choose to tick off the questions as they are answered correctly, write down the answers you are given, or even record the session as an audio or video file.

The effort you go to in recording the evidence will relate to how significant it is in your overall judgement of the candidate's competence. For example, if it is a short list of 'yes/no' or procedural questions used to test background knowledge during a comprehensive practical demonstration, you may decide to use a simple checklist.

On the other hand, if the questions form a major part of the assessment evidence, you should always keep a documentary record of what the candidate says. This is particularly important for competency assessments that are linked to licences or other types of formal accreditations involving third parties or government agencies.

Here are some tips on asking effective questions:

  • Make sure you have the full list of questions you want to ask before you start, so you don't forget anything important.

  • Be clear about what you're asking, and if a question has several parts, ask each one separately.

  • Don't ask leading questions - that is, questions that imply an answer - such as: 'You wouldn't use a shifting spanner for that job, would you?'

  • If you want a candidate to explain more fully what they mean, ask follow-up questions to prompt them to give you more information.

Open and closed questions

Open and closed questions are both good ways of finding out how much a candidate knows about a subject or activity. However, they have different functions, depending on what type of information you're looking for in the answer.

Closed questions require a 'Yes' or 'No' answer. Examples are: 'Have you used this tool before?' and 'Have you completed a WHS induction session?'

Open questions allow the candidate to 'open up' with an explanation or description. Examples are: 'How would you report a malfunction to your supervisor?', or 'What procedure do you need to follow in an emergency evacuation?', or 'Why is it important to shut off the valve before you disconnect the unit?'.

You can also use open questions to pose 'what if' scenarios, such as: 'What would you do if you heard an unusual noise coming from the motor?'. These are sometimes called problem scenarios, and are an excellent way of testing the candidate's understanding of what to do when things go wrong.

Written tests

A written test is an efficient way of determining a candidate's theoretical knowledge, especially when you are assessing several people at once. It also helps to ensure that everyone gets exactly the same questions and is treated equally. Once a test has been completed, it provides good documentary evidence that can go on file.

However, its biggest disadvantage is that not all candidates have sufficient literacy skills to cope with a written test. For people who can't do the test on their own, you need to identify their literacy difficulties beforehand, and either help them read the questions and write their answers, or conduct the test verbally.

When you are deciding on the types of questions to include and how the test will be structured, bear in mind the general rule that the literacy skills needed to complete the test should never exceed the literacy requirements of the job itself.

Presentations

For particular subjects, presentations can be a great way of getting the trainee to research a topic and then explain it back to you in their own words.

In a group situation, participants can deliver their presentations to the group, allowing everyone to learn at the same time.

When you're assessing a candidate while they're delivering a presentation, however, you need to be careful that you're assessing their understanding of the topic, rather than their presentation skills. The only exception, of course, is when their presentation skills are part of the performance criteria for the competency, such as in some communication competencies.

Third party reports

There might be times when the worker you're assessing is a member of your normal work team. In these cases, you may already have a clear idea of their underpinning knowledge of the task.

In other cases, you may not know the person at all, or they may work in an area that you don't have much involvement in. These are the instances where a third party report is a good way of verifying the candidate's background knowledge and capacity to get a job done properly.

Third party reports are generally written by the candidate's immediate supervisor or manager, and provide details on their ability to apply their knowledge and skills to the job under normal day-to-day conditions. It's helpful if you provide a template or checklist for the supervisor to complete, because it lets you explore the 'range of variables' and 'dimensions of competence' in the report.