The 2013 training program has four formative workplace-based assessment tools.
To introduce workplace-based assessment, former Chair of the Workplace-based Assessment Committee, Dr Richard Horton, talks about the rationale for workplace-based assessment and the purpose of each tool, in a podcast that can be accessed here.
Access and learn about the tools
The mini clinical evaluation exercise (mini-CEX), direct observation of procedural skills (DOPS) and case-based discussion (CbD) can be completed within the training portfolio system, or on paper and then submitted to the system by the assessor. The multi-source feedback (MsF) forms should be completed on paper and returned to the supervisor of training by the individual providing the feedback, then consolidated and reviewed by the supervisor of training. The supervisor of training should present the outcome to the trainee and record it in the training portfolio system.
The mini-clinical evaluation exercise (mini-CEX) provides supervisors with a structured assessment and feedback format for the relevant clinical knowledge (including reasoning and understanding), skills (technical and non-technical) and behaviours related to the trainee’s management of a single clinical case.
The trainee and the supervisor agree on an appropriate case before the assessment starts. The case should be one that the trainee is able to comprehend and manage reasonably without direct intervention by the supervisor (this is referred to as being at the trainee’s learning edge).
Trainees should be mindful of the need to ask for help as required. Seeking guidance, as appropriate, will be viewed positively in the assessment.
The mini-CEX has three components
- Discussion regarding relevant aspects of clinical knowledge, understanding and reasoning related to the case. The trainee should be able to articulate and justify their plan at the standard expected for their level of training. Consideration should be given as to whether this discussion should occur in front of the patient or not.
- Observation of the trainee managing the case. It is important that the trainee is "in the driver’s seat". The supervisor may need to intervene from time to time for reasons of safety and efficiency. The supervisor should note down what supervisory interventions were required and why. This forms the basis of the constructive feedback to assist the trainee in attaining greater autonomy. The supervisor should also note when no intervention was required and discuss this in the feedback.
- Feedback. This is the most important aspect of the process. Feedback should be given verbally as soon as possible after the observation. The setting should be private and free from interruption if possible. It should be reiterated that the feedback is for the purpose of training only and will only be shared with ANZCA representatives for that purpose.
Mini-clinical evaluation exercise (mini-CEX) form. Download a PDF of the mini-CEX form.
Direct observation of procedural skills (DOPS) is designed to assess, and provide structured feedback on, both knowledge of and technical proficiency in a discrete procedural skill.
The procedure itself may be performed either:
a. As part of the usual clinical workload.
b. By simulation (for example, on a part-task trainer).
The assessment has three components:
- A discussion regarding relevant aspects of anatomy, indications, contraindications, complications and side effects, specialist equipment required, patient positioning and monitoring, and consent issues. It is useful to ask the trainee to outline how they will do the procedure and what precautions they will take before they start the procedure. Consideration should be given as to whether this discussion should occur in front of the patient or not.
- Observation of the consent process and the procedure.
- Provision of feedback.
Towards the end of the assessment form, there is a global assessment of the level of supervision the assessor believes the trainee requires when performing the procedure. This decision should be based on questioning and direct observation of the trainee's performance; it does not depend on how many times the trainee has performed the procedure or on the trainee's level of training.
If the assessor believes the trainee still requires direct supervision for this procedure, they need to provide feedback and document in the assessment what the trainee needs to demonstrate in order to be able to do the procedure without direct supervision.
Direct observation of procedural skills (DOPS) form. Download a PDF of the DOPS form.
The case-based discussion (CbD) should only require 10-20 minutes of discussion, and the whole process should only take 30-45 minutes.
The trainee brings copies of the anaesthetic records of at least three cases they have dealt with reasonably independently (level 3 or 4 supervision) and the assessor chooses the most appropriate one for discussion. Occasionally the supervisor of training may direct a trainee to have a particular case assessed and in this case the trainee needs to take a copy of that specific anaesthetic record along to the assessment.
Anaesthetic records should be de-identified for privacy and confidentiality reasons.
- The trainee presents the case to the assessor. The assessor puts a brief summary in the field "Case details".
- Suggested foci for discussion are provided in the form. The assessor should include the headings of the foci discussed in the field "Discussion foci". An estimate of the complexity of the discussion should be given.
- The trainee is rated according to how much prompting he or she required to demonstrate adequate reasoning and other skills, for safe care.
- Feedback should be given at the time of the assessment. It should be specific and constructive. The trainee should be given advice on areas that he or she needs to focus on in his or her future study and structures that he or she may find helpful for approaching tasks such as formulating plans.
Case-based discussion (CbD) form. Download a PDF of the CbD form.
This is a formative assessment, which is undertaken once in each training period to contribute towards each core unit review, at which time the results are considered with those of other workplace-based assessments. The MsF should be completed by both specialist anaesthetists and other team members (for example provisional fellows, surgical registrars, senior anaesthetic and recovery nursing staff) with whom the trainee works.
Each trainee coordinates the distribution of the MsF forms to assessors with sufficient time for them to be returned to the supervisor of training. Ideally a minimum of seven forms will be available to the supervisor of training for the MsF to be compiled. To this end, and to meet the number requirements, the trainee should use their judgment to decide how many forms to circulate, assuming that only 50 per cent of colleagues may respond.
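As a rough guide, the number of forms to circulate can be estimated from the minimum required and an assumed response rate. A minimal sketch (the function name and the rounding-up rule are illustrative, not part of the program's requirements):

```python
import math

def forms_to_circulate(minimum_required: int, response_rate: float) -> int:
    """Estimate how many MsF forms to distribute so that, at the assumed
    response rate, at least the minimum number are returned."""
    return math.ceil(minimum_required / response_rate)

# With the suggested minimum of seven returned forms and a 50 per cent
# response rate, fourteen forms would need to be circulated.
print(forms_to_circulate(7, 0.5))
```

In practice the trainee's own judgment of likely respondents should override any such estimate.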
It is important to stress that the responses are returned to the supervisor of training before the supervisor reviews and collates a summary response to subsequently discuss with the trainee. This aims to ensure confidentiality, and also allows the supervisor of training to give the trainee a global assessment rather than focusing on individual comments.
Download the multi-source feedback (MSF) form. Download a PDF of the MsF form.
Frequently asked questions
Can I claim CPD credits for performing workplace-based assessments?
You may claim CPD credits for performing certain WBAs with trainees. The value of the CPD program is to encourage a culture of learning, review, reflection and participation. As a guide, the recommendation is that:
• Mini-CEX and DOPS assessments should attract credits from Category 3, level 1.
• A Case-based Discussion should attract credits from Category 3, level 2.
• Teaching colleagues about workplace-based assessment should attract credits from Category 1, Level 1.
• Learning about workplace-based assessment should attract credits from Category 1, Level 1 or 2.
• Multi-source feedback does not attract any CPD credits if contributing towards feedback for a trainee.
Are we turning out such poor quality anaesthetists that we need to have such a massive overhaul? Will these assessment tools make better anaesthetists at the end of their training when the consensus is that we already produce good anaesthetists?
ANZCA and all Fellows involved in training can be rightly proud of the quality of the anaesthetists we produce. This does not mean we should be complacent; we should still seek ways to improve the quality, consistency and efficiency of training and make the experience better for both trainees and trainers. The drivers for introducing workplace-based assessment (WBA) are outlined in the video-cast “Introduction to WBA” and include:
- More frequent and better quality feedback.
- Identification of issues earlier so there is more time for remediation.
- Providing a framework for remediation.
- Mapping progress from novice to expert.
- Certification of what the trainee can actually do.
- Promotion of ANZCA training to patients, other team members, departments and hospitals.
Incorporating workplace-based assessment is not a “massive overhaul” and does not require a great deal of extra effort, because it gives structure to activities that anaesthetists are already doing informally. Aspects of the program, such as the online platform, the ease of data entry by assessors and of information retrieval by supervisors of training, the easier management of trainees in difficulty, and a streamlined in-training assessment process, mean that in some areas less effort will be required.
The current system is designed to assess knowledge, not procedural skills or attributes and behaviours such as communication, teamwork, management and professionalism. There is consensus that these skills and attributes are equally important when ensuring standards, not only on average but also at the minimum level we expect for fellowship.
What is the evidence for introducing workplace-based assessments (WBAs)?
Research into medical education assessment usually looks for evidence of validity, reliability, feasibility and educational impact of any form of assessment. Concerning these areas, the evidence for WBAs compares favourably with assessments such as multiple-choice questions, short-answer questions and viva examinations, which we already use. This is addressed in the video “Introduction to WBA”.
It is difficult to do research providing evidence that any form of medical assessment produces better doctors, because until now we have not had the tools to objectively measure a doctor's performance. The measurement of the quality of an anaesthetist depends on workplace-based assessment. It would be difficult to compare departments that used WBA with those that don’t, because those that don’t would not be able to demonstrate the quality of their anaesthetists.
As trainees can initiate an assessment and choose who will provide them with feedback via multi-source feedback, won’t they just choose people they know will be lenient? How will we stop trainees going to the “doves” for their required workplace-based assessments?
The assessment tools are designed to encourage honesty from assessors. For example, the mini-CEX rating scales are based on the degree of independence demonstrated for a given degree of case complexity. Even assessors who want to be kind to a trainee are unlikely to document greater independence than they have actually observed, as this could potentially harm both patients and the trainee.
Feedback is designed to be constructive, benefiting the trainee's development and career. Assessors recognise the benefits of helping a trainee with honest feedback.
Trainees are more likely to choose an assessor because they respect their judgment rather than because they are known to be lenient, which means they will take any feedback more seriously.
Requiring trainees to select assessors enhances the feasibility of the WBA program because the trainees are best placed to know who in a department is in the best position to provide feedback.
Regarding mini-CEX, DOPS and CbD, assessors can and should initiate assessments too. They do not have to be asked by a trainee. Supervisors of training can also direct assessments by influencing rostering and requesting specific assessments from assessors. This flexibility allows solutions that will suit different departmental situations.
Why are nine-point scales used in the workplace-based assessment (WBA) program? I prefer scales with fewer points.
Some WBA tools were trialled using a five-point scale; however, significant numbers of assessors wished to rate trainees between the existing points, as they felt the scale did not allow enough discrimination.
In a 2009 paper, Cook and Beckman (Adv in Health Sci Educ 14:655-664) compared the traditional nine-point mini-CEX scale with a five-point scale to see whether the number of scale points made any difference to the inter-rater reliability and accuracy of the scores. They used videotaped resident-patient encounters, most of which were scripted to reflect a specific level of competence. They found no difference in inter-rater reliability, but the nine-point scale had greater accuracy than the five-point scale (P<.0001).
And from Streiner D and Norman G, “Health measurement scales: a practical guide to their development”, 3rd ed., Oxford Medical Publications, 2003, p37: “The optimum number of points is thought to be five to nine. More points gives more precision and reliability but more than nine increases difficulty of use.” They provide the evidence by calculating the reliability of scores from a data set with a large number of points and progressively collapsing the scale.
Also from Bondy KN, “Criterion-referenced definitions for rating scales in clinical evaluation”, Journal of Nursing Education 1983;22(9):376-382: “Psychometric studies indicate that the reliability of rating scales increases when the number of scale points increases from two to seven with little additional gain from 11 to 20. Using too many points leads to greater variability without increasing precision.”
How will trainees react to receiving ones and twos early in their training when they are used to scoring above average?
While this is a common question when participants first start using the tools, it has not proven to be an issue in practice, probably because the rating scales in the mini-CEX, for example, clearly relate to the level of supervision required rather than the relative performance of the trainee.
The trainee is likely to understand and appreciate that a high level of supervision is required early in training. Experience from centres currently trialling the tools indicates that most trainees can score six to seven for cases of very low complexity within several weeks to a few months of starting training. This illustrates the advantages of picking cases where the complexity loosely fits the capabilities of the trainee.
The assessors are also asked to rate the level of training that they think the trainee is performing so the trainee can see that the particular score may be appropriate for their level of training.
What are the minimum numbers of workplace-based assessments (WBAs) that a trainee needs to submit?
There are minimum WBA requirements for each of the core units, the provisional fellowship training year and each of the specialised study units. To obtain the full benefit of the WBA program, assessors and trainees should do far more than this and aim to incorporate some WBA into their daily routine.
| Core units | CbD | Mini-CEX | DOPS | MsF |
| --- | --- | --- | --- | --- |
| Introductory training (six months) | | One pre-op assessment, one pain round, four other | One RSI and extubation, one B and M + insertion of LMA, one CICO simulation, one machine check | One |
| Basic training (18 months) | Six | 12 | 12 | One |
| Advanced training (24 months) | Eight | 16 | Eight | One |
| Provisional fellowship training (12 months) | Two | | | |

| Specialised study units | CbD | Mini-CEX | DOPS | MsF |
| --- | --- | --- | --- | --- |
| Head and neck, ear nose and throat, dental surgery and electro-convulsive therapy | One | One includes preoperative assessment, one other | | |
| Neurosurgery and neuroradiology | Optional | One involving the head, one other | | |
| General surgical, urological, gynaecological and endoscopic procedures | Optional | Four | | |
| Thoracic surgery | Optional | One | One (DLT) | |
| Cardiac surgery and interventional cardiology | Optional | | | |
| Obstetric anaesthesia and analgesia | One GA CS | One (LUSCS), one other | One epidural in labour, one spinal/epidural for CS, one resus of newborn | |
| Vascular surgery and interventional radiology | Optional | One revascularisation, | | |
| Paediatric anaesthesia | Optional | One pre-op assessment, | One block for penile or inguinal surgery, one face mask < 2 yo, one ALS simulated | |
| Plastic, reconstructive and burns surgery | Nil | | | |
What are the transition arrangements regarding these minimum numbers?
The transition arrangement is that minimum requirements will be pro-rata of the training time still required within the new curriculum for the core study units. For example, if a trainee has 18 months accredited in advanced training at the start of the 2013 training year, the minimum requirement will be one quarter of the numbers outlined above for advanced training. The exact number and type of workplace-based assessment will be up to the supervisor of training.
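The pro-rata calculation in the example above can be sketched as follows (the function name is illustrative, and rounding up to whole assessments is an assumption; as noted, the exact number and type remain at the supervisor of training's discretion):

```python
import math

def pro_rata_minimum(full_minimum: int, months_remaining: int,
                     unit_length_months: int) -> int:
    """Pro-rata WBA minimum for the training time still required in a core unit."""
    return math.ceil(full_minimum * months_remaining / unit_length_months)

# A trainee with 18 of 24 months of advanced training already accredited has
# six months remaining, i.e. one quarter of the unit. Applied to, say, the
# advanced-training minimum of 16 mini-CEX assessments:
print(pro_rata_minimum(16, 6, 24))
```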
Are workplace-based assessments (WBAs) formative or summative forms of assessment?
This has been well answered in a recent review paper on WBA in The Surgeon, Journal of the Royal Colleges of Surgeons of Edinburgh and Ireland (The Surgeon 9, 2011, S12-13): "The primary purpose of WBA is to aid learning by providing trainees with constructive feedback, based on objective, structured assessment (assessment for learning). It has been suggested that renaming WBA as workplace-based assessment for learning (WOBAL) might better explain this primary purpose! Although the principal role of each assessment is to aid learning, a collection of assessments can be used to inform the Annual Review of Competency Progress (ARCP), provided that the trainee has requested enough assessments to provide sufficient evidence. This evidence, collated in the trainee’s portfolio, thus becomes a summative assessment of learning.”
A similar approach is being taken by ANZCA. Rather than an annual review, the collection of assessments will be used to inform the clinical placement reviews (CPR) and the core unit reviews (CUR).
What makes a satisfactory assessment?
Rather than a particular assessment being seen as satisfactory or not, the assessments together give an indication of satisfactory progress. There is a question at the end of the DOPS form that asks “Does this assessment need to be repeated?” and, in the mini-CEX and CbD forms, with the addition of “for this type of clinical case?”. If the answer is “yes”, the assessor is prompted to give a reason why. This flags to the trainee and the supervisor of training that additional assessments, above the required minimum, may need to be done in order to demonstrate satisfactory progress.
What makes satisfactory progress?
Reasonable expectations for trainees are outlined below:
Low complexity cases (complexity rating 1-3)
- By the end of introductory training, should only need ANZCA level 2+ supervision equivalent to mini-CEX global scores greater than three.
- By the end of basic training, should only require ANZCA level 4 supervision equivalent to mini-CEX global scores greater than six.
Moderate complexity cases (complexity rating 4-6)
- By the end of basic training, should only need ANZCA level 2+ supervision equivalent to mini-CEX global scores greater than three.
- By the end of advanced training, should only require ANZCA level 4 supervision equivalent to mini-CEX global scores greater than six.
High complexity cases (complexity rating 7-9)
- By the end of advanced training, should only need ANZCA level 2+ supervision equivalent to mini-CEX global scores greater than three.
- By the end of provisional fellowship training, should only require ANZCA level 4 supervision equivalent to mini-CEX global scores greater than six.
Introductory training (first six months); basic training (next 18 months); advanced training (next 24 months); provisional fellowship training (final 12 months).
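The expectations above reduce to two mini-CEX global-score thresholds per complexity band. A minimal sketch, assuming only the mapping listed above (the dictionary keys and function name are illustrative):

```python
# A global score greater than three corresponds to needing no more than
# ANZCA level 2+ supervision; greater than six corresponds to level 4.
THRESHOLDS = {
    "low (1-3)": {"end of introductory": 3, "end of basic": 6},
    "moderate (4-6)": {"end of basic": 3, "end of advanced": 6},
    "high (7-9)": {"end of advanced": 3, "end of provisional fellowship": 6},
}

def meets_expectation(complexity: str, stage: str, global_score: int) -> bool:
    """True if the score exceeds the expectation for that band and stage."""
    return global_score > THRESHOLDS[complexity][stage]

# A score of four on a moderate-complexity case meets the end-of-basic
# expectation (greater than three):
print(meets_expectation("moderate (4-6)", "end of basic", 4))
```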
Who is eligible to submit workplace-based assessments (WBAs)?
WBA assessors can be any Fellow of ANZCA or other specialist appointed by an ANZCA-accredited department or a provisional fellowship trainee in that department who works in the subject area appropriate for that WBA. Supervisors of training can nominate specialists to undertake workplace-based assessments by completing a WBA Assessor nomination form and submitting this to the College.
Multi-source feedback can and should be provided by WBA assessors, but can also be requested from patients, nursing staff, non-anaesthesia specialists or any other individuals who observe the trainee at work. Consolidated multi-source feedback can only be submitted into the training portfolio by the supervisor of training, or a person designated by the supervisor of training, so trainees will need to arrange for their feedback to be directed to these individuals.
The videos below have been created to be used in workplace-based assessment (WBA) workshops in Australia and New Zealand throughout 2012 and 2013. The videos show various trainee scenarios in the operating theatre and have been created to enable the viewer (an assessor) to become proficient with the mini-CEX workplace-based assessment.
- Workplace-based assessment (WBA) training video 1
- Workplace-based assessment (WBA) training video 2
- Workplace-based assessment (WBA) training video 3
- Workplace-based assessment (WBA) training video 4
A series of PowerPoint presentations and templates is available below for the delivery of workshops in local departments. Please note that your user name and password will be needed to access the PowerPoint slides.
- PowerPoint slides for delivery of a two-hour workshop
These slides are to be used in addition to the above presentation.
- PowerPoint slides for delivery of a three-hour workshop
The following two documents have been produced as templates to assist with the delivery of workshops.
- Attendance Sheet
The attendance sheet should be completed for each workshop and sent to email@example.com