Performance Solution Provider

Welcome to our Blog

Latest Posts

in Assessment Practices

Trainers, Assessors or Talent Development Professionals?

21 August 2018

How did we get here? And how are we preparing ourselves to deal with current and future challenges?

Trainers have used the concept of competency-based vocational education since the 1800s. “As man invented tools, weapons, clothing, shelter and language, the need for training became an essential ingredient in the march for civilisation.” (Cloyd S. Steinmetz, Training and Development Handbook: A Guide to Human Resource Development, 2nd Edition, ASTD, 1976)

Competency-based vocational education was introduced as a component of the tertiary education system in the second half of the 20th century. The purpose of university education differs from that of vocational education: while university education seeks to disseminate and further develop human knowledge, vocational education seeks to transfer the competencies required to successfully perform a job.

Steinmetz provided a relevant definition for our profession as vocational trainers: “... a man had the ability to pass on to others the knowledge and skill gained in mastering circumstances”. That vocational axiom hasn’t changed with the adoption of competency-based education in an institutional setting. We still need vocational trainers to transfer the skills and knowledge they developed while performing their occupation (the occupation they are teaching).

The objective of RTOs and trainers in VET is to produce job-ready graduates. And we do that using vocational training programs built on performance objectives, based on tasks currently performed by workers in the relevant occupation.

The need to incorporate foundation skills, values and attitudes into the set of cognitive and non-cognitive skills required to perform a job has been a continuous issue for VET, considering the diversity of learners’ backgrounds and their levels of ability.

But the world of work is today’s biggest challenge for the VET system, and consequently for trainers. The growth of human knowledge and the rapid pace of change in the world of work put the VET system, and the validity of its content, under pressure.

Changes in competency benchmarks require skills and knowledge to be continuously added to and removed from training programs, and therefore demand intensive involvement of trainers in professional development activities to maintain industry currency.

The time required to develop Training Packages, including the task and occupational analyses used to derive the performance objectives that RTOs use to design their training curricula, puts the system at risk of not being responsive enough to industry needs.

The concept of accountability inherent in the definition of competency-based education now extends beyond the Training Package description and requires interpretation for a specific use. Trainers need to master the skills of interpreting/unpacking training packages to create relevant training for students who will use the skills and knowledge learnt in the workplace context.

Where is our profession going?
The explosion of technology, including computer-based training and online learning, has put new tools in the hands of trainers. Yet how we conceptualise vocational training has also evolved. VET is no longer merely focused on acquiring skills through training; now consideration must be given to partnering with businesses to deliver impact.

The success of VET lies in trainers accepting the concept and understanding how to implement it. Research indicates that trainers tend to teach the way they have been taught; this suggests that prospective trainers should be trained in a competency-based setting if they are expected to implement this type of education.

Trainers’ credentials must provide a solid base in instructional design, adult learning principles and competency-based education systems. Relevant trainers’ forums are required to share, moderate, and further develop perspectives and skills.

We are in the profession of talent development; our profession demands continuous professional development to respond to the dynamic forces that transform the world of work.

The value of trainers is not defined by compliance. The activities trainers perform must meet the quality expectations that compliance describes, but we can’t be trapped in the Standards’ box.

Our job is to develop human capital and support the industry to perform better, but trainers also “... serve the needs of minorities, including older persons and the handicapped, and to provide for equal opportunity and non-discriminatory treatment. Such social growth factors are among our greatest assets and are needed in the release of human greatness." (Cloyd S. Steinmetz, The History of Training, ASTD, 1977)

in Quality and Compliance

Compliance starts with communication

21 August 2018

When you look at the statistics published by ASQA, and the results from its regulatory work, they show the darkest face of VET. Since AQTF 2001 was implemented, I wonder if we have been stuck in a continuous cycle of simply raising awareness and checking the compliance box.

Most of the work around compliance has been about the regulation, “The VET Quality Framework”, and not about what we are regulating: competency-based training and assessment practices. But the true behaviour change we need requires activities that help VET practitioners visualise and build the skills required to implement a complex competency-based training and assessment system, not legal practitioners working to comply with legal instruments.

We have failed to create formal or informal opportunities to build and apply principles of instructional design that support competency-based vocational education relevant to the Australian VET environment.

Consultations and other consultative approaches have not successfully exposed potential pressure points within our system, nor explored the competing perspectives of different VET players that arise in real life. Instead, policy makers, regulators, training providers and industry continue to work in isolation.

How we communicate the core message
Compliance is a difficult issue to tackle because it is about performance and, therefore, it depends heavily on communication and practice.

Internal communication within RTOs
Managers need to play a key role in the effective communication of performance objectives. Simplifying current regulatory statements, providing clear and relevant points about abstract policies, and showing how those policies apply in real-life situations are other ways to promote better understanding.

Individuals working in the sector need to build the skills required to interpret regulatory compliance and implement complex technical processes. Such training must be contextualised to the RTO’s environment; this can be done by using relevant, real examples instead of “ideal world” examples, which pose the risk of disbelief.

Discussions about compliance must value the diversity of perspectives and experiences within the sector; incorporating them will provide a frame of respect and sensitivity around vocational training and education issues, and it can also promote greater efficiencies within the system. The notion of “this is how we do it” from one side, or “this is how it’s always been done” from the other, can be a blocking mechanism to effective change and growth as a sector.

It’s no secret—countless research studies have demonstrated that behaviour change unfolds in phases and takes time, consistency and relevant interactions. Changing the behaviours described in ASQA’s report will not be the exception.

By providing open communication channels and relevant training, RTO managers can help staff (including trainers) understand the effect of their decisions and the difference they can make to achieve quality outcomes and compliance.

The above rationale can be extrapolated to other players within the system, including the regulators. Inconsistencies and poor performance among ASQA’s auditors, training package writers and policy makers can likewise be addressed by adopting open communication channels and relevant training.

In competency-based vocational education, not only are students accountable for their own learning; all stakeholders are accountable for their own performance.

in Training Evaluation

Is VET Trapped in The Capabilities Vs Performance Issue?

2 November 2017

Frequently, I encounter VET practitioners whose actions and comments indicate an assumption that building capability and enhancing performance are the same thing.

Learning alone will not yield performance results. No business or performance measure improves because of what people know; these measures improve because of what people do with what they know. VET practitioners not only have no control over what students do with what they learn, but very little is done to measure performance results.

What Is the Difference Between Capability and Performance?
Enhancing capability or skill is a learning outcome. It means that people have the capability to perform in some manner. It does not mean that they will.
A performance outcome occurs when people take what they know and turn it into what they do on the job. And, of course, making the conversion from learning to doing requires a work environment that supports the capability that was developed.

Engaging industry stakeholders in the planning of our Training and Assessment Strategies will help individuals and organisations use the capabilities we develop in the VET sector to improve performance.

A good process to go through with industry stakeholders is to review the “skill... will... hill” process and to work together on better training evaluations.

People develop skills but then need both the will (motivation) to apply that skill, and the ability to overcome any hill (obstacle) in the work environment that could impede application. Only then can performance result from the capability that has been developed. For this to happen, we need more and better collaboration between RTOs, SSOs and industry.

We know that performance is what people do on the job. We also know that, too frequently, people acquire capability that they never use on the job. Yet VET training is expected to yield results. Training Package developers play an important role here.

As VET professionals, we need to make performance, and not just learning, our business. And we can do that in two ways:

  1. We keep clear in our minds the difference between skill and performance. Training Packages are Occupational Standards and should focus on outcomes and performance.
  2. We view the building of capability as a means to the end, not the end itself. Our end goal is to enhance on-the-job performance that benefits the organisation. Industry engagement will provide information about how the work environment will support the skills we plan to develop. We need to partner with industry representatives who can work with us to ensure skills transfer to the workplace.

in Learning Design

Can VET Match Micro-learning Solutions?

31 October 2017

Using Skill Sets to meet industry needs.

Vocational Education and Training must provide solutions and support individuals and industry in Vocational Preparation and Vocational Development (Continuous Professional Development).

Although our VET system is a leader in Vocational Preparation, mainly because of government funding conditions, RTOs are losing opportunities in Vocational Development programs.

Non-accredited training programs are providing an incredible range of learning opportunities to support our workforce with professional development. These programs are presented in different formats, from online platforms and symposiums, to summits and conferences. And, importantly, these micro-learning options are meeting current industry needs.

To compete in the corporate training and development world, RTOs should look at these opportunities and use micro-learning techniques to meet that demand.

The flexibility of training packages that allows for the delivery of stand-alone units and skill sets is not recognised in government-funded programs, which today account for more than 70 per cent of all VET training delivered in Australia.

Rapid changes in industry processes and technological advances, together with the definitive adoption of robotics in the workplace, have created a growing need for continuous development of skill sets.

The Australian government should update funding programs to include skill sets and stand-alone units, as this is the easiest way to measure the return on investment in these training programs.

I started looking at international trends for micro-learning and discovered some interesting statistics. According to the Association for Talent Development (ATD), 92 per cent of organisations (worldwide) are using micro-learning plans, and over 67 per cent of organisations not yet using micro-learning are planning to start. For RTOs to develop industry-relevant training products, we should look at these statistics.

Micro-learning techniques have three primary benefits, and this is why organisations are considering these options:

  • Micro-learning is cheaper and faster. Materials take less time to source, produce, maintain and consume than full qualifications. This enables re-use and re-packaging of micro-learning programs. It also allows trainers to focus on quality without sacrificing the amount of training, because irrelevant skills are not included in the program.
  • People are more engaged. Employees today devote 1 per cent of their time to learning (roughly 24 minutes a week), check their phones 150 times a day, and switch tabs every minute. Micro-learning fits perfectly into this continuous diet of email, Slack, and social media.
  • People learn more. Though there are many factors that drive effective learning, managing cognitive load is one of the most important. The problem with typical learning experiences like lectures or long e-learning videos is that they present too many things at once for too long a period of time.

These are real benefits, but they don't necessarily translate to improved performance on their own. Through industry consultation we discovered that timing plays an important part, and the key is to have a training solution to solve current problems.

One of the most difficult and least scalable things organisations must do is motivate their employees, and learning requires a lot of sustained motivation. Compliance training is a good example.

But how can we identify the right time when our participants' motivation is high?

There are reliable triggers that open up motivational windows in which individuals are willing, even excited, to learn. These windows can last from a few months (Think: when someone is given a new role or responsibility), to a few weeks (Think: when someone has a big deadline or presentation coming up), to a few minutes (Think: when someone is walking into a big meeting for which they're not fully prepared).

In today's competitive environment, RTOs are required not only to set Learning Objectives that describe what participants will be able to do at the end of the training, but also Application Objectives that determine how and when those skills and knowledge can be used and applied, so that participants are attracted at the right time, when motivation is high.

Learning experiences presented to learners at the wrong time will produce little or zero results, and the margin for error is very slim.

Continuous review of our VET sector, Training Packages and funding arrangements is required, and our Nationally Recognised Training System should be adapted to meet emerging needs in vocational education and adult learning trends. This new generation of micro-learning solutions is certainly making an impact.

in HR Management

Three Reasons Why Compliance Training Fails

31 October 2017

We are in the training industry, yet many training programs, including some formal training programs, fail to have a positive effect on an RTO's performance.

In this article, I will analyse the top three reasons why RTO compliance training fails.

Lack of Alignment with RTO's Needs
The payoff from a training program comes from the business measures that drive it. Simply put, if a training program is not aligned or connected to a business measure, no improvement can be linked to the program. Too often, training is implemented for the wrong reasons: a trend, a regulatory requirement, or a perceived need that may not be connected to an RTO measure.

Initial training needs may be linked to objectives and evaluation by using a consistent four-level concept:

  1. Reaction (How we want students to perceive the program and its outcomes)
  2. Learning (What new skills and knowledge we want students to learn)
  3. Application (How we want students to use the new skills)
  4. Impact (What RTO performance metrics we want to change)

Without the business connection at Level 4, the program will have difficulty achieving any results.
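
To make this four-level chain concrete, here is a minimal sketch in Python. The level names come from the list above; the program details and field names are hypothetical, invented purely for illustration:

    # A minimal sketch (hypothetical names) of linking a training program's
    # objectives to the four evaluation levels described above.
    from dataclasses import dataclass, field

    LEVELS = ("Reaction", "Learning", "Application", "Impact")

    @dataclass
    class TrainingProgram:
        name: str
        objectives: dict = field(default_factory=dict)  # level -> objectives

        def missing_levels(self):
            """Levels with no objectives defined; a program missing an
            'Impact' objective has no Level 4 business connection."""
            return [lvl for lvl in LEVELS if not self.objectives.get(lvl)]

    # Hypothetical program: PD sessions with no Impact objective defined.
    pd = TrainingProgram(
        name="Assessor PD sessions",
        objectives={
            "Reaction": ["Trainers rate the sessions as relevant"],
            "Learning": ["Trainers can unpack a unit of competency"],
            "Application": ["Trainers map assessment tools to unit requirements"],
            # No "Impact" entry: no RTO metric is linked to the program.
        },
    )
    print(pd.missing_levels())  # -> ['Impact']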

One major RTO faced this problem directly as it reviewed its Trainers' Professional Development Plan. Several PD sessions were conducted to further develop trainers' skills and knowledge in assessing students. The PD sessions were not connected to any RTO performance metric, such as the number of non-compliances against clause 1.8 or the number of rectifications identified in validations. The sessions were also not connected to the RTO's operations, so participants couldn't use the procedural skills back on the job; as a result, the RTO's assessment practices didn't improve.

Failure to Recognise Non-Training Solutions
If the wrong solution is implemented, little or no payoff will result. Too often, training is perceived as the solution for a variety of performance problems when training may not be the issue at all.

A recent evaluation of a community college illustrated this problem. In its training program, the college attempted to prepare career counsellors to advise potential students about training products. The problem the college faced was that a significant number of students enrolled in inappropriate courses, and the training produced little change in that outcome.

An impact study subsequently revealed that the culprit was the enrolment procedure that accepted enrolments prior to potential students' interviews with career advisers. When probed for a reason for the poor results, the college realised that unless its enrolment procedure changed to provide time for career advisers to interview potential students prior to enrolments being accepted, the results would not change.

Attempting to solve job performance issues with training will not work when factors such as systems, job design and motivation are the real issues. To overcome this problem, staff training must focus on methods to analyse performance rather than conduct traditional training needs assessments – a major shift in performance improvement that has been developing for many years.

Up-front analysis should be elevated from needs assessment, which is based on skills and knowledge deficiencies, to a process that begins with business needs and works through the learning needs.

Lack of Specific Direction and Focus
Training should be a focused process that allows stakeholders to concentrate on desired results. Training objectives should be developed at higher Kirkpatrick levels than traditional learning objectives. These objectives correspond with six measures that lead to a balanced approach to evaluating the success of training. Most training programs should contain objectives at multiple levels, ideally including those at Levels 3 and 4.

An RTO's internal training is often decided without consulting all stakeholders. What are the RTO's performance needs for the CEO, the Marketing Manager, the Training Manager, the Quality and Compliance Manager? When developed properly, and in consultation with all relevant stakeholders, these objectives provide important direction and focus.

Training designers and developers must focus on application and effect, not just learning. Facilitators need detailed objectives to prepare individuals for the ultimate outcomes of the learning experience: job performance change.

Participants need the direction provided by Level 3 and 4 objectives to clearly see how the training program's outcome will actually help the RTO.

Not all programs will need to undergo such detailed up-front analysis, but it is a critical issue that needs more attention, particularly when training is expected to have an effect on the RTO's performance.

in Learning Design

VET Is Not About Content

17 September 2017

Too many trainers still stand behind a podium, relying on content to drive learning. It's time for those trainers and instructional designers working in the Vocational Education sector to realise that content is not what drives learning in VET.

We don't teach content, we teach people. We teach people to achieve outcomes, to perform a job under industry standards.

Some factors to consider

Information overload. Students can watch speakers, read information sheets and research content at any time. Students need the interaction, the engagement and the experience.

The internet provides access to infographics, case studies, blogs, podcasts, videos and tweets about almost anything. As VET practitioners, we can make good use of them, but these publicly available resources alone will not make a relevant learning experience for our learners.

We live in the Information Age—and there's too much of it! For example, according to some estimates published by the Association for Talent Development (ATD), there are more than 120,000 books and texts on leadership development, with 3,000 more being published each year. We don't have a content problem; we have a filter problem. We must filter that content through the context of whom we're trying to connect with and teach.

Content is what we're pouring into people. Context is everything that makes those people unique. It's why they're doing the training: the conditions where they will be applying their learning, the expectations of their clients and workplace. It's their age, interests, attention span, engagement level and beliefs.

People learn in the silence. We learn in the pauses, reflection and meditation. Don't you have your best ideas when meditating, in the shower, while driving, or when falling asleep? We learn in the spaces in between life. We can't deliver lectures to learners anymore; that's not how people learn.

Content is only one part of the equation. VET programs should always be based around the learn-say-do-reflect model. It's about providing an experience. We can't teach someone to ride a bike, drive a car, or use new technology without putting them on the bike or in the car, or putting the device in their hands.

Attention-span deficit. We live in the digital era, where our minds switch on and off every 5 to 20 minutes. The average song is about three to four minutes. The average watching time of a YouTube video is three to five minutes. A movie scene runs anywhere from a brief moment to no more than 15 minutes before switching to a new scene. It takes no more than 15 to 20 minutes to read an article in any paper. TED Talks are 18 minutes. Stories in the news last no more than a few minutes, unless they are documentaries.

We can't lecture or speak to learners (of any age) for more than 15 to 20 minutes at a time. Their attention will be gone after that. People start wondering what's next. They check their smartphones. They look at the clock.

Students need more space. Spaced learning is about engagement, conversations and one-to-one interaction. It's about exercises, simulations, demonstrations, and students teaching students. Spaced learning is about reflection, giving participants time during the session to turn their insights into actions.

After a training course, people go back to their lives, their desks, their email and texts, or the next most important thing on the list, but not back to reflect.

VET programs must provide the framework to support students' learning post-training. We need to provide action plans, explain exactly what they need to do immediately to get to the next level, and show how to progress. In other words, follow through on the promises made with the learning objectives.

VET is not about content because we don't teach content, we teach people. We facilitate learning experiences. That's what we do.

in Quality and Compliance

Trainer’s upgrade: a cost or a solution?

13 September 2017

Since the Assistant Minister for Vocational Education and Skills, the Hon Karen Andrews MP, announced the most recent amendment to the requirements for trainers and assessors working in VET, many RTO managers and trainers have regarded it as just another compulsory course to be recorded as an operational expense.

Like other compliance requirements, many people in this sector believe there has been limited analysis regarding how this training will affect an RTO. Yes, it means the RTO will comply with the standards, but will the added cost solve any problems?

If this training doesn't have a positive effect on stakeholders' performance and results, it is not a solution. In this article, I would like to analyse the desired and potential effects of this TAE upgrade for RTOs and trainers.

Firstly, let's clarify the requirement. Under the updated Standards for RTOs, trainers and assessors using the TAE40110 Certificate IV in Training and Assessment as their teaching credentials must hold the following two units before 1 April 2019:

  • TAEASS502 Design and develop assessment tools, and
  • TAELLN411 Address adult language, literacy and numeracy skills.

Why are trainers required to further increase skills in developing assessment tools and addressing adult LLN skills?
According to statistics published by ASQA, approximately 75% of RTOs fail to demonstrate compliance with assessment practice requirements and with matching students' LLN skills against course entry LLN levels.

Is there a performance issue?
Yes, there is a clear performance issue with assessment practices. Assessment systems used by RTOs are not meeting training package requirements or the Principles of Assessment, and do not produce sufficient, valid, authentic and current evidence.

The second issue is related to students being enrolled into courses without determining whether entry LLN skill levels have been met.

What is happening or not happening?
Based on my experience as an auditor, I have identified five critical factors that affect RTOs' assessment practices:

  1. Units of competency are not unpacked effectively.
  2. Assessment evidence is not analysed correctly.
  3. Assessment collection methods, tasks and evidence are poorly mapped to the unit of competency requirements.
  4. Adequate instructions are not given to assessors on how to administer assessment tools and interpret assessment evidence.
  5. Assessment tasks are administered inconsistently.

Can these issues be solved with training?
We can only solve problems with training if there is a gap in skills. And yes, trainers and assessors currently working in VET have significant gaps in skills/knowledge, particularly those required to:

  1. Interpret units of competency.
  2. Develop effective assessment tools (instructions and tasks) to collect evidence against the requirements of units of competency.
  3. Implement assessment practices in line with the Principles of Assessment, and
  4. Collect assessment evidence that meets the relevant unit of competency requirements and the Rules of Evidence.

But performance issues go beyond an RTO's assessment practices. They relate directly to gaps in the skills of its trainers, but a lack of support and of effective quality assurance systems also plays an important role.

Is TAEASS502 Design and develop assessment tools the solution?
It could be, but it won't be if we continue to do the same as we have been doing with the previous upgrades (BSZ to TAA, and TAA to TAE).

Let's start with the outcomes included in the unit. TAEASS502 elements are:

  • Determine the focus of the assessment tool
  • Design the assessment tool
  • Develop the assessment tool, and
  • Review and trial the assessment tool.

This unit is relevant to four out of the five performance issues listed above, and will provide trainers with the opportunity to develop at least the first two sets of skills listed in the skills gap.

When the training solution is designed, developed, delivered and assessed, the impact objectives must be considered. In other words, this course must be adopted not only as the training to meet the new requirement under clause 1.14 (trainers' credentials), but as the training solution that will support the RTO to meet the requirements under clause 1.8 (assessment practices).

Considering the structure of the Standards for RTOs, being non-compliant with clause 1.8 will also produce non-compliance with clauses 1.4, 1.12, 2.1, 3.1 and 8.4. Furthermore, this course should also have a positive effect on the compliance status of clauses 1.9 and 1.10 (validations) and 1.16 (trainers' relevant PD).

In summary, a Statement of Attainment for the TAEASS502 unit can give the RTO a tick against clause 1.14, but the real benefit, and return on investment, will only happen if trainers develop the skills required to perform the tasks needed to meet the requirements of clauses 1.4, 1.8, 1.9, 1.10, 1.12, 2.1, 3.1 and 8.4.

If we compare the cost of the course with the benefits of maintaining compliance with clauses 1.4, 1.8, 1.9, 1.10, 1.12, 2.1, 3.1 and 8.4, the potential positive return on investment is evident. RTOs should therefore see this course as an investment, not a cost: an investment that can produce real, tangible benefits far greater than the investment itself. I suggest that RTOs measure this benefit.

Obviously, if the course doesn't produce a positive effect on operations, the investment will become a cost. This means it is critical that RTOs discuss with the training provider the desired application and impact objectives for the course.

RTOs will need to ensure trainers and assessors will have the opportunity and the support to apply the skills learnt. This may require a change to current practices. For example, trainers should be more involved in designing, developing and reviewing assessment tools, and validation processes may need to be strengthened so they have a greater effect as quality review and control processes.

How can we measure the application of the skills? What data needs to be collected?
There are some points that need to be considered here:

  • What new knowledge will be applied?
  • What new tasks will be performed? What new steps?
  • What new procedures and processes will be implemented or changed?
  • What new guidelines will be implemented or changed?

The answers to the above questions will help us to determine what data will be collected.

For a standard RTO, new tasks could include: interpreting/unpacking units of competency, analysing the assessment evidence required, considering learners' needs during the design of assessment tools, considering the rules of evidence during the design of the evidence collection plan, or reviewing mapping documentation. These tasks/steps will have an effect on the processes of designing, developing and using assessment tools, and for this reason, RTOs must review and update the procedures and guidelines already in place to support the application of the new skills. A minimal sketch of how this application data could be recorded follows.
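
Here is that sketch: a simple application-data record per trainer, with hypothetical field names based on the questions and tasks above. Nothing here comes from any standard or real system:

    # A minimal sketch (hypothetical fields) of an application-data record
    # an RTO could collect per trainer after the TAE upgrade.
    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class ApplicationRecord:
        trainer: str
        collected_on: date
        knowledge_applied: list = field(default_factory=list)   # new knowledge used
        tasks_performed: list = field(default_factory=list)     # new tasks/steps
        procedures_changed: list = field(default_factory=list)  # processes implemented/changed
        guidelines_changed: list = field(default_factory=list)  # guidelines implemented/changed
        barriers: list = field(default_factory=list)            # anything blocking application

    record = ApplicationRecord(
        trainer="J. Smith",  # hypothetical trainer
        collected_on=date(2019, 6, 30),
        tasks_performed=["Unpacked units of competency",
                         "Reviewed mapping documentation"],
        procedures_changed=["Assessment tool design procedure updated"],
        barriers=["No time allocated for validation work"],
    )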

The reason to measure application is not only to confirm the success of the training, but also to support continuous improvement. Analysis of the application data should reveal whether the skills were enabled or whether there were barriers. The RTO can use this information to overcome barriers and explore ways to maximise the positive effect on assessment practices.

How can I measure the effect of the application of the new skills?
At this level, the RTO wants to measure the effect on assessment practice outputs, quality, cost and time.

With regard to outputs, the RTO could measure an increase in the number of assessment tools developed (whether developed completely in-house or by customising commercially available products), or an increase in the number of assessment validations completed. With regard to quality, the RTO should measure the number of rectifications identified in validations and internal audits, and the number of non-compliances identified by ASQA. Costs can be determined by measuring the reduction in costs associated with engaging external consultants to develop assessments, and with rectifying assessment tools and/or the assessment evidence collected. Finally, the RTO can measure, for example, a reduction in the time required to develop or modify assessment tools. A sketch of how such measures could feed a return-on-investment figure follows.
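
The sketch below uses the standard ROI formula; every dollar figure is invented for illustration and does not come from any RTO:

    # A minimal ROI sketch with invented figures (illustration only).
    # Standard formula: ROI (%) = (benefits - costs) / costs * 100.

    course_cost = 1200.0            # hypothetical cost of the upgrade per trainer
    consultant_savings = 2500.0     # reduced spend on external assessment developers
    rectification_savings = 1800.0  # fewer rectifications after validations/audits
    time_savings = 900.0            # value of faster assessment tool development

    benefits = consultant_savings + rectification_savings + time_savings
    roi = (benefits - course_cost) / course_cost * 100
    print(f"ROI: {roi:.0f}%")  # -> ROI: 333%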

The opportunity is there, and whether this upgrade has a positive effect on our VET sector will depend on the approach RTOs and trainers take.

in Assessment Practices

Collecting relevant assessment evidence

8 August 2017

Don't tick the wrong box

Assessment systems continue to be the most challenging area in an RTO's operations, and yet the most critical for demonstrating quality outcomes. When dealing with assessments in Australia's VET environment, we need to consider both the assessment system used by the training organisation and the outcomes produced by that system: the "assessment evidence" collected and used to make a competency judgement against the unit(s) of competency.

I would like to use this article to reflect on "assessment evidence", particularly evidence used to support decisions around completed tasks and the demonstration of skills.

Quite often in my work as an auditor, I see "observation checklists" based on tick boxes next to text copied and pasted from the unit of competency's performance criteria.

Assessment activities used to produce evidence of a candidate's skills will always require a task to be completed under the relevant conditions and standards (the unit of competency's elements and performance criteria), and will provide candidates an opportunity to demonstrate the skills required to perform that task. Knowing is not the same as doing, and VET is about doing. That is a fundamental principle for the design of the assessment but, as mentioned above, I will focus here on the evidence produced, not so much on the task itself.

Do we have rules for accepting assessment evidence in Australia's VET sector? Yes: the rules of evidence require that evidence be valid, authentic, sufficient and current, and these rules must guide assessors during the collection of evidence.

Ok, let's start with Validity. What is considered as "Valid" evidence? According to the Standards for RTOs, evidence used to make a competency judgement must confirm "...that the learner has the skills, knowledge and attributes as described in the module or unit of competency and associated assessment requirements." In other words, the assessment evidence collected confirms the candidate's ability (performance evidence and knowledge evidence) to achieve each outcome (Element) described in the unit of competency, under each condition/standard (Performance Criteria).

How can we prove that an outcome has been achieved? The evidence must provide details about what was achieved, when it was achieved, and in which context. A tick in a box will not provide that information.

Some assessors think they can just tick candidates off as competent based on their "professional judgement". And on occasion they feel insulted when the evidence used to make the judgement is requested. Quite often I hear: "I used my criteria from 20 years of working experience." To be very clear, I am not questioning an assessor's industry experience; I celebrate that. But competency-based assessment is an evidence-based system. In other words, the judgement is made based on the evidence collected.

When someone performs a task, it results in either producing a product or delivering a service. If a product or sub-product is produced, the product itself will constitute valid evidence that the assessor can then assess against a benchmark (the unit). Assessing products requires comparing the product's characteristics, features and use to the outcomes described in the relevant Element/PC of the unit. Assessors can use records of the product's characteristics; for example, if the product is an object, you can record details of its physical characteristics (length, size, weight, height, resistance, conductivity, etc.), while if the product is something more intangible, such as a plan, the characteristics recorded and assessed could include content, relevance of the information provided, usability, veracity of instructions, and feasibility of projections/forecasts.

If the task is a service (delivered to internal or external clients), records of the service provided will constitute valid evidence. For example, if the service is to resolve a customer complaint, evidence could include records of the complaint resolution, feedback from the client, photos, videos and records of the observation of the candidate dealing with the client (details of the protocol/procedures followed, techniques used, skills demonstrated, etc.).

The quality, quantity and relevance of the evidence collected must support the assessor's judgement. In general terms, learning in the vocational education and training spectrum means a consistent change in the candidate's attitudes. In other words, the candidate is able to use the new skills and knowledge consistently, and apply them in different contexts and situations.

The above means that evidence must demonstrate that the candidate has performed the task(s) more than once. In some cases, the unit of competency indicates a specific minimum number of occasions on which a task must be performed. Where it does not, RTOs should use industry engagement activities to determine a benchmark for sufficient evidence, in line with industry standards. This is the requirement under the rule of sufficiency.

The assessment evidence constitutes a legal document and, as such, the authenticity of the evidence is paramount. How can we prove that the evidence presented was either produced by the candidate, or speaks about the candidate? What measures are we using to demonstrate authenticity? In VET, there are three types of evidence we can use: direct, indirect or supplementary.

When collecting direct evidence, it is important that the candidate's identity is confirmed, that the assessor observes or witnesses the task being completed or conducts oral questioning, and that details of the event are recorded (i.e. date, time, location, duration).

Tools used to produce indirect evidence, such as finished products, written assignments, tests, or a portfolio of evidence from a project, must include measures to confirm authenticity. These could include photographic or video evidence, or further questioning by the assessor about the procedure(s) used to complete the task and how that procedure would be adapted if the situation or context were different. Many RTOs use a "declaration of own work" by the candidate as well.

Supplementary evidence produced by third parties such as supervisors, colleagues, or clients, can represent a challenge. This evidence is usually produced in the workplace. Measures to prove authenticity could include using referees to confirm the claims made in the third-party reports, or providing an opportunity for the assessor to visit the workplace for further observations/interviews.

Finally, evidence collected must meet the rule of currency. This may be particularly challenging in an RPL assessment. Assessment evidence must prove that the candidate demonstrated the relevant skills and knowledge at the time the competency judgement was made, or in the very recent past. What constitutes the "very recent past"? In some cases, the unit of competency provides information about currency; if no information is provided in the unit, RTOs should use industry engagement activities to establish a criterion for currency, in line with industry standards. In general terms, any evidence collected about something that happened more than two years prior to the assessment judgement is potentially not current (although in some industries evidence from more than two years ago may be accepted).

The bottom line here is that an assessment judgement must be made after the assessment evidence has been collected and compared against the unit requirements.

The assessment evidence recorded (the facts of the case) demonstrates that the learner has the skills, knowledge and attributes described in the unit, and represents the legal proof of the competent/not yet competent judgement that will be available for subsequent procedures such as appeals, validations, audits or reviews. This evidence must meet the rules of evidence; otherwise the RTO will be in breach of clause 1.8 of the Standards for RTOs.